What Would a Conservative Social Network Look Like?

Since 2006, Facebook, the world’s most popular online social network, has been open to anyone with an email address. Over the years, news stories have become an integral part of users’ news feeds.

In 2011, Facebook began allowing news organizations to sponsor stories that would appear unsolicited in targeted users’ feeds. Because so many people (1.79 billion users at last count) use Facebook on a near-daily basis, many of them (recent surveys put the figure at 40 percent of all Americans) have come to rely on the site as their primary source for news and information about what’s going on in the world.

This has both positive and negative aspects. It’s convenient to get an extremely wide range of news stories from a single source, and it’s appealing to get comments and feedback from friends and others about those same stories in the same place, at the same time.

However, over the last few years, whether by accident or by design, a substantial majority of users began noticing that much of the news they saw on Facebook had a liberal or progressive slant: stories about the environment and global warming, about social-justice conflicts, and about left-leaning political figures such as Senators Bernie Sanders and Elizabeth Warren. Many users rightly began to suspect that Facebook’s upper management was biased toward progressive political causes and platforms.

This suspicion was confirmed when Facebook Chairman and CEO Mark Zuckerberg acknowledged that he supported and had donated money to liberal Democratic candidates. In 2016, Facebook’s co-founders gave more than $35 million to Hillary Clinton’s campaign, likely because she had promised them and other tech leaders that she would allow more foreign-born H-1B visa workers into the U.S., which would let these companies greatly lower their labor costs (labor happens to be Facebook’s largest expense).

But the bigger outcry over bias didn’t come until May 2016, when the technology website Gizmodo published a story claiming that Facebook’s news curators routinely suppressed conservative-friendly news stories and sources. That story led to inflammatory articles and opinion pieces on many conservative news websites and in right-leaning publications across the country.

Leading conservatives demanded that Facebook take action. In response, Mark Zuckerberg hosted a meeting at Facebook’s headquarters on May 18 at which he promised to address the perceived faults. “I know many conservatives don’t trust that our platform surfaces content without a political bias,” said Zuckerberg following the meeting. “I wanted to hear their concerns personally and have an open conversation about how we can build trust.”

Additional conservative content was explicitly allowed to appear on Facebook, but the company still does not disclose how many people regularly see which content; to a large extent, it’s up to each user to follow the pages whose content he or she wants to see.

Facebook stores data about a user’s preferences and actually labels users “liberal,” “conservative” or “moderate.” As a Facebook user, you can see your own label by going to http://www.facebook.com/ads/preferences and clicking the “Lifestyle and Culture” tab in the “Interests” section.

Facebook’s algorithms should, in theory, present content that matches each user’s stated preferences. But there is still anecdotal evidence, particularly from this most recent election cycle, that users across the spectrum are being exposed to more left-leaning content than right-leaning material.
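Facebook has never published the details of its feed ranking, so any outside description is guesswork. Purely as an illustration of the idea of preference matching described above, the sketch below scores stories against a hypothetical stored interest label and topic list; the Story record, its “lean” tag and the scoring weights are assumptions for the example, not Facebook’s actual system.

    # Purely illustrative sketch of preference-based feed ranking; this is
    # NOT Facebook's actual algorithm. The Story record, the "lean" tag and
    # the scoring weights are hypothetical assumptions.
    from dataclasses import dataclass
    from typing import List, Set

    @dataclass
    class Story:
        title: str
        topics: Set[str]   # e.g. {"environment", "economy"}
        lean: str          # "left", "right" or "neutral" (assumed tag)

    def score(story: Story, user_label: str, user_topics: Set[str]) -> float:
        """Score one story for one user: topic overlap plus a lean bonus."""
        topic_match = len(story.topics & user_topics)
        lean_bonus = 1.0 if (user_label, story.lean) in (
            ("liberal", "left"), ("conservative", "right")) else 0.0
        return topic_match + lean_bonus

    def rank_feed(stories: List[Story], user_label: str,
                  user_topics: Set[str]) -> List[Story]:
        """Return the stories sorted best match first."""
        return sorted(stories,
                      key=lambda s: score(s, user_label, user_topics),
                      reverse=True)

Even in a toy model like this, whoever assigns the topic tags and the lean labels controls what each user ends up seeing, which is exactly the concern raised above.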

Facebook has admitted that even though it has turned news presentation over to automated artificial intelligence (AI) and eliminated most of its human curators, real people are still involved in the news selection process. Facebook has also admitted that the algorithms it designed to filter out “fake news” are imperfect, but CEO Zuckerberg has denied that his company was responsible in any way for the election’s outcome.

Less discussed by Zuckerberg was the fact that some of the news sources Facebook promotes are not traditional major outlets such as The New York Times or The Washington Post; many are third-party websites with names like Mic, Vox, Now This, Think Progress, Raw Story, U.S. Uncut and The Other 98%. Also less discussed is that a large number of these outlets are owned or controlled by Democratic-aligned companies and entrepreneurs.

In many cases, they receive funding directly from globalist billionaire George Soros through his Open Society Foundations (OSF). It’s also possible that many of these sites are given favorable placement or deeply discounted ad rates by Facebook, but there’s no hard information to confirm or deny this.

In the run-up to the election, more than 90 percent of the stories posted by The Other 98% were pro-Hillary Clinton and anti-Donald Trump. The Other 98% posted roughly one story per hour, virtually every day, for the past year. The fact-checking website PolitiFact (which is itself left-leaning) gave The Other 98% a “50 percent False” rating in its evaluation of the outlet’s content.

In addition to posting these stories, sites like U.S. Uncut and The Other 98% have numerous fake Facebook accounts post supportive comments beneath each post. With some routine investigation, it’s easy to see that these accounts are fake: they have no photos other than a profile picture, and their “friends” number fewer than 10.

When one looks at their timelines, they’re typically filled with either spam or 100 percent biased political posts; there are no messages to friends, birthday celebrations or discussions of anything resembling a real life. It’s likely that sites like The Other 98% outsource the creation of these fake profiles and their commentary to yet another company, so they can conveniently deny responsibility for the fake responses attached to their posts, which are invariably read and absorbed along with the story itself.
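The tells described above are simple enough to express as a rough heuristic. The sketch below is only an illustration of that idea, assuming a hypothetical Profile record with a photo count, a friend count and a list of timeline posts; the field names, keyword list and thresholds are assumptions, not any platform’s real API or a proven detector.

    # Rough illustrative heuristic for spotting likely fake profiles, based on
    # the informal tells described above. Profile, its fields and the keyword
    # list are hypothetical assumptions for the sake of the example.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Profile:
        photo_count: int            # photos beyond the profile picture
        friend_count: int
        timeline_posts: List[str]   # text of recent timeline posts

    SPAM_OR_POLITICAL = ("clinton", "trump", "election", "click here", "free offer")

    def looks_fake(profile: Profile) -> bool:
        """Flag a profile that matches every tell at once: no photos beyond
        the profile picture, fewer than 10 friends, and a timeline made up
        entirely of spam or political posts."""
        if profile.photo_count > 0 or profile.friend_count >= 10:
            return False
        if not profile.timeline_posts:
            return True
        matching = sum(
            any(keyword in post.lower() for keyword in SPAM_OR_POLITICAL)
            for post in profile.timeline_posts
        )
        return matching == len(profile.timeline_posts)

A flag like this would only ever be a starting point for the kind of manual checking described above, since casual real users can also have sparse profiles.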

Even if the original post is 100 percent factual, many of these fake accounts’ comments will make completely false claims and try to whip up even more controversy than the post itself. But most viewers of the comments won’t take the time to figure out which users are real, which are fake and which comments can’t be trusted.

To make matters worse, if a real user who disagrees with the post adds a critical comment, these fake accounts will verbally assault them. Many of these attacks appear completely automated: rather than criticizing anything specific the new commenter said, the attacker simply utters a generic insult, like “Sure, keep wearing that tin-foil hat!” or “Looks like he’s off his meds again!” Further comments bring additional attackers, so anyone offering even constructive criticism is quickly shot down, and the post’s claims go increasingly unchallenged and accepted.

With social media’s extensive reach throughout the world, the influence of these nefarious tricks should not be underestimated. They’re essentially the equivalent of a stage magician planting audience members who testify to the trick and sway the belief of the crowd. Such techniques would not be acceptable in real-life political forums, so why are they acceptable online?

Recently, while not addressing the above phenomenon, Facebook has taken steps to help users recognize that certain news stories are fake, allowing them to flag stories for vetting by third-party fact-checkers (such as PolitiFact). But this is only a partial solution, and it likely won’t prove effective for conservatives, because the fact-checking organizations also have ties to George Soros’ OSF and other globalist donors.

In the end, what’s likely required is an entirely new social network, built from the ground up either to be truly neutral or to lean as far to the right as Facebook leans to the left. Whichever the case, the intention should be stated up front, and no attempt should be made to fool the public.

Building a social network site is not rocket science, and it may not even be necessary: several virtual clones of Facebook already exist, such as Tsu.co and VK.com, which duplicate roughly 90 percent of Facebook’s features and look-and-feel.
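To give a sense of why “not rocket science” is a fair description, here is a minimal sketch of the data model at the core of any Facebook-style network: users, friendships, posts and a chronological feed. Everything in it is illustrative; a real site would add authentication, storage, moderation and ranking on top.

    # Minimal illustrative data model for a Facebook-style network.
    # A real implementation would need persistence, authentication and
    # moderation; this only shows how small the core concepts are.
    from dataclasses import dataclass, field
    from typing import Dict, List, Set
    import time

    @dataclass
    class Post:
        author: str
        text: str
        timestamp: float

    @dataclass
    class Network:
        friends: Dict[str, Set[str]] = field(default_factory=dict)
        posts: List[Post] = field(default_factory=list)

        def add_user(self, name: str) -> None:
            self.friends.setdefault(name, set())

        def befriend(self, a: str, b: str) -> None:
            self.friends[a].add(b)
            self.friends[b].add(a)

        def publish(self, author: str, text: str) -> None:
            self.posts.append(Post(author, text, time.time()))

        def feed(self, user: str, limit: int = 20) -> List[Post]:
            """Newest-first posts from the user and the user's friends."""
            visible = self.friends[user] | {user}
            relevant = [p for p in self.posts if p.author in visible]
            return sorted(relevant, key=lambda p: p.timestamp, reverse=True)[:limit]

The hard part isn’t the code; it’s attracting users away from an entrenched network, which is what the mass-exodus idea discussed below is meant to address.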

Unlike Facebook, Tsu.co actually pays its members to be on its network (it passes some of its ad revenue directly to users instead of keeping all of it for itself), so using the site could even be profitable for users.

There are also other networks, such as Diaspora, that are 100 percent open-source and free of advertising. Most of these networks have nowhere near the privacy compromises and real-name restrictions that Facebook has; users can join under other names, comment anonymously and, in some cases, eliminate ads entirely.

All that conservatives who want to make their power known would need to do is organize a mass exodus from Facebook to one of these other networks, preferably on a specific day announced in advance.

A specific target date would generate more attention for the move and get more people to join in; people who are told to act on a certain day, rather than given a vague timeframe, are more likely to follow through.

If such a plan were announced, you can bet that Facebook would respond quickly with more promises to change its ways. But the company shouldn’t be believed; it has already been given enough chances to reform its approach. The politics of its upper management almost certainly won’t change, and any further overtures will just be lip service to conservatives.

Mark Zuckerberg has already made enough negative public statements about President-elect Donald Trump this year; it’s high time he felt the power of action rather than the heat of words.

