
Imagine if it wasn’t a Fox News talking head, but an incendiary video built on falsehoods being spread to incite violence. Or what if it were 2011 again, ISIS beheading videos were being shared, and the evidence showed they were driving up ISIS recruitment?


I would still rather have a proper judge make that decision than Facebook. I can easily block people who share BS on Facebook, but I don't want Facebook to make that decision for me. That overrides my own choice to follow them in the first place.


Let's be more specific: you're in Myanmar (aka Burma), and there are Facebook groups calling for, and even organizing, massacres of the Rohingya minority. We should wait for a judge to shut that down? When the relevant judge is part of the people organizing the massacres? And no, this isn't hypothetical, this happened, and it happened on Facebook.


At that point it's not about stopping the posts on FB anymore but about arresting these people. Yes, a judge should make that decision.


The state of Burma allowed the massacres, though. You can't trust a judge there to shut it down.


After that episode, Facebook should have been subject to an immediate inquiry, and then US courts should have ordered FB not to operate in Burma/Myanmar until further notice.


Well, so you want FB to step outside the law and filter content. I don't see a way they can be in the right here.


What is a “proper judge”? You mean you want the courts to moderate your social platform?


> You mean you want the courts to moderate your social platform?

Honestly? Yes. At least that way there's impartiality, accountability, an appeals process, and enforcement.

If Facebook self-moderates you get all of the same downsides of "big government"[1] moderation and none of the benefits listed above.

[1] I assume your argument is framed in terms of "big government", as though an unregulated, for-profit company holding a near-monopoly on key parts of modern society is somehow superior to any kind of state involvement in reining in excesses that the free market fails to address.


> You mean you want the courts to moderate your social platform?

In situations involving literal terrorist propaganda and active calls for violence (which were the examples given), both of which are already illegal?

Yes. The courts are the proper place for determining how literal terrorism/imminent threats of violence should be handled.

I don't think it's controversial to say that people who make imminent calls to violence — speech the courts have already defined as illegal — should be handled by the law.

Almost everything else, though, should not be blocked by the platform.


That should be OK. If my friend wants to send it to me, why shouldn't he be allowed to?


On an individual basis - one-to-one communication, sure.

But on a publishing platform, where a posted article can reach millions of viewers who have no connection to the author and would miss important context... that won't end well.

There's a difference between Facebook facilitating private communication among individuals and small groups, with inherently limited information spread (e.g. phone calls, emails, IM), and Facebook operating a publishing platform that enables mass communication. The problems we're seeing today stem from that same mass-communication publishing platform being used as a state-level propaganda tool to sway public opinion (e.g. Russia discouraging Dem-leaning voters in 2016) at one end, and from Facebook knowingly allowing extremist groups to operate on its platform and coordinate real-life terroristic assaults at the other.

