jakelazaroff 5 years ago

I'm glad the attorney general is getting involved. We need to start charging Facebook execs for these flagrant privacy violations. They're being fined 3 billion dollars for legal expenses relating to an FTC inquiry… and their stock price went up by 8% [1].

The market just does not care; it's time regulators and law enforcement started to.

[1] https://www.barrons.com/articles/facebook-stock-is-up-becaus...

  • jonahb 5 years ago

    The market cares. It had already priced in a fine, possibly one larger than 3B. The reduction in uncertainty -- knowing the magnitude of the fine -- also positively affects the stock price. If the fine were cancelled tomorrow, you can be sure that, all else equal, Facebook's market cap would jump by 3B.

    The fines need to be bigger.

    • baroffoos 5 years ago

      They need to increase the fine substantially for each repeated violation. After a few violations it's going to be hard to justify a $50B fine to investors.

      • omeid2 5 years ago

        The fines also need to be fair, not fixed-cost. I think the EU has got the percentage-of-revenue approach right. A million-dollar fine can be a death sentence for a small company, while for a big player it is loose change; that is not fair.

        • cobbzilla 5 years ago

          I agree that percent of revenue is a much better metric to base these kinds of fines off of, but not without reservations -- a small company with high sales revenue but razor thin margin could get killed by a % revenue fine; this seems unfair to the little guy.

          On the other hand, if the fine is % income or anything but % gross receipts, then of course the system will be gamed endlessly by accountants and lawyers to show the smallest possible net number. The end result might be worse than a fixed fine. So % of revenue it is.

          • WrtCdEvrydy 5 years ago

            > So % of revenue it is.

            GDPR, 4% of Global revenue and your Directors can be barred from operating in the EU.
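            The scaling difference is easy to see with a toy calculation. A minimal sketch, with all revenue figures hypothetical:

```python
# Toy comparison of a fixed fine vs. a GDPR-style percentage-of-revenue fine.
# All figures are hypothetical, chosen only to illustrate the scaling.

def percentage_fine(annual_revenue: int, rate_percent: int = 4) -> int:
    """GDPR-style fine: a flat percentage of global annual revenue."""
    return annual_revenue * rate_percent // 100

FIXED_FINE = 1_000_000                 # $1M flat fine
small_company_revenue = 5_000_000      # $5M/year
big_company_revenue = 70_000_000_000   # $70B/year

# A $1M fixed fine is 20% of the small company's revenue (a death sentence)...
assert FIXED_FINE / small_company_revenue == 0.2
# ...but a rounding error for the big one. A 4% fine scales with the offender:
assert percentage_fine(small_company_revenue) == 200_000
assert percentage_fine(big_company_revenue) == 2_800_000_000
```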

      • kerng 5 years ago

        Totally agree. By the third time there's really no excuse not to bump the fine up until it threatens the existence of the organization, or the organization changes its behavior.

        For FB this is the nth time. I don't even know what n is anymore.

        • acct1771 5 years ago

          Neither do any of the regulators...

      • rhizome 5 years ago

        No, they need to adjust the first-time penalties for deterrence. $50B at a minimum, because it has to hurt their stock price when they do these things. Or hey, how about a year of revenue.

    • Simple_Guy 5 years ago

      The fines should be 100 billion. Make them itch a little.

    • chongli 5 years ago

      Yes. The fine needs to take into account how much Facebook would stand to lose if the case went to trial and they were found guilty, setting a precedent for future actions against them. That seems like it could result in near-unlimited liability for Facebook. The fine for avoiding an admission of wrongdoing should be set equal to the expected value of all that liability, based on an estimate of the probability of winning the case.
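      That break-even logic is just an expected-value product. A minimal sketch, with purely hypothetical numbers:

```python
# Sketch of the expected-value argument: a settlement fine that makes the
# defendant indifferent between settling and going to trial.
# Both inputs are hypothetical placeholders, not real estimates.

def settlement_fine(trial_liability: float, p_government_wins: float) -> float:
    """Expected value of the liability avoided by settling."""
    return trial_liability * p_government_wins

# If a guilty verdict would expose FB to $100B in liability and the
# government wins such a trial 25% of the time, the break-even fine is $25B.
assert settlement_fine(100e9, 0.25) == 25e9
```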

      • throwaway5167 5 years ago

        There's no way the government has the same resources as Facebook to make that case. The FTC loses when it goes to court. Facebook would have more lawyers on the case than the FTC has paralegals total.

        • chopin 5 years ago

          And that guarantees victory? Interesting judicial system you have, over there.

          • luckylion 5 years ago

            Where is that fundamentally different in other countries for very large (especially domestic) players?

            Will Germany actually take a swing at Volkswagen? Will Germany try to go after those that defrauded the government with cum-ex trades?

            Everybody vs banks regarding the Libor scandal? Sure, they paid some fines, but the fines were far below the damage done, and it appears to still have been a good investment for the banks, that is: all in all they made money, fines included.

            Crime does pay at that level.

            That's not something US specific, I believe.

            • chopin 5 years ago

              I was referring to the fact that you can outpay your opponent.

              Yes, large corporations can get away with much more in other countries as well, but rarely by outpaying their opponents.

              • acct1771 5 years ago

                Disagree. The money is very important to creating and maintaining the position they sit in.

  • Causality1 5 years ago

    Personally I think monetary fines are a fundamentally flawed form of punishment. Send. Decisionmakers. To. Prison. I don't care if it's for just a single day but march them into a courthouse in an orange jumpsuit and handcuffs.

    • spinningslate 5 years ago

      This. Whether jail or some other sanction, hold decision makers personally accountable. Fines have their place, but - at least for a firm of FB's size - they just get priced into accounts.

      The FB/Zuckerberg combination is almost unique in terms of (1) market cap, (2) the controlling ownership he holds, and (3) behaviour of the firm.

      Facebook is reputedly pedalling furiously behind the scenes to prevent Zuckerberg himself from being charged, whilst saying almost nothing about the fines. That says everything about where they see the real danger.

    • marcosdumay 5 years ago

      Did they commit any crime?

      I do agree that fines are overused, but I don't think there's reason for sending anybody to prison on this case.

      • Causality1 5 years ago

        Why is Facebook anticipating a five billion dollar fine if they didn't break any laws?

        • marcosdumay 5 years ago

          Breaking laws is not the same as committing crimes.

  • hjk05 5 years ago

    No matter how the market reacts, 3 billion is still 3 billion. People need to stop looking for justice in market trends; investors are not pricing Facebook based on how much cash it has in the vault right this instant, and if a verdict comes in below expectations the price is bound to rise.

    Could you imagine the absurdity of them passing down a verdict of “and they shall be fined a billion per month until their stock price falls by at least 10%!”

    • save_ferris 5 years ago

      The purpose of the fine is to discourage FB from engaging in illegal and unethical behavior. The problem here is that FB sees a $3B fine as the cost of doing business, since they surely made much, much more than $3B engaging in these invasive, user-hostile tactics. If they know that regulators will only come after them for an arbitrarily low percentage of revenue generated from these activities, why would they stop doing them?

      If a punishment no longer deters bad behavior, then it's no longer a punishment.

    • colpabar 5 years ago

      I think the point is that we need to punish companies like Facebook in ways that might actually incentivize them to behave differently. If they get slapped with a fine that they can easily pay and then see their stock price go up, what's the point?

  • kevin_thibedeau 5 years ago

    What laws have they broken? Without laws to break there isn't much justification for punishment.

    • kerng 5 years ago

      Theft of personal information seems the most obvious one that they could be charged with. Unauthorized access of email accounts and exfiltration of information is something hackers have gone to jail for.

      • joering2 5 years ago

        A single instance can put you behind bars for 20 years. Zuck would be sentenced to 250,000 years for the mass email-access hack, even if it cleverly asked users to hand over their passwords of their own free will. But to paraphrase: kill one man, that's murder; kill a million, that's a statistic.

    • doodliego 5 years ago

      The Computer Fraud and Abuse Act (CFAA) prohibits accessing a computer without authorization, or in excess of authorization.

    • spaceheretostay 5 years ago

      Facebook have broken a huge list of laws. But recently from my reading I think the current issue is a breach of contract. They signed a legal contract with the FTC and then broke it, hence the fine.

      There doesn't seem to be any debate that Facebook broke the law - just debate about what the punishments should be.

  • orijing 5 years ago

    Is it a coincidence that the stock price would move on the day when FB releases earnings?

  • SheinhardtWigCo 5 years ago

    Not just execs, I hope. The engineers who wrote the code and managers who told them to do it should also face justice.

    • kerng 5 years ago

      Interesting point. At first it sounded a bit harsh, but thinking about it... in the end, as software engineers, we are the ones pulling the trigger, and your statement highlights our ethical responsibilities.

      The problem, though, is that software engineering has no licensing requirements, so one can always claim incompetence was the problem (oops, a bug) and might get away with it.

      • throwaway_9168 5 years ago

        Well, unlike other lowly developers, FB engineers know exactly what is and isn't the right kind of mistake to make in pursuit of a company's vision.

        Someone who works at FB even came out and said so:

        "Remember, what Facebook is doing has never been done before. There are going to be mistakes."

        https://news.ycombinator.com/item?id=19321420

        So yeah, I think what that person is saying should be translated as:

        "Please don't think twice about punishing rank-and-file FB engineers. We are so competent, all our mistakes have only been intentional ones."

    • orijing 5 years ago

      Have you never committed a bug before?

      > A Facebook spokesperson said before May 2016, it offered an option to verify a user's account using their email password and voluntarily upload their contacts at the same time. However, they said, the company changed the feature, and the text informing users that their contacts would be uploaded was deleted — but the underlying functionality was not.

      I doubt it was an engineer who deliberately removed the text but kept the contact import functionality.

      • spaceheretostay 5 years ago

        > Have you never committed a bug before?

        Engineers who make mistakes that harm people are still responsible for the mistakes they made. You cannot just claim "it was a bug" and get off scot free if your code harms someone or otherwise breaks the law. Also there's no need for this sarcastic tone, "have you never..?"

        > I doubt it was an engineer who deliberately removed the text but kept the contact import functionality.

        Why would you doubt that? I personally think that situation sounds quite likely. But either way we're just speculating.

        Also, don't ignore the part of the parent comment that discusses the managers (and, implicitly, other decision makers) whose decisions resulted in an illegal change to the code.

        Engineer, or manager, or QA assistant - someone or some group of people will have made the change. And "oops, that was a bug" doesn't count. Corporations and their employees must be held to the same laws and standards as the rest of us. "Oops, I didn't mean to do that" doesn't fly as an excuse to break the law.

    • droithomme 5 years ago

      I agree, but with conditions. PEs have ethical responsibilities. If the engineers are licensed, yup, they absolutely have a moral and legal responsibility to refuse to sign off on illegal work. In most states, though, software engineers are not professionally licensed, certified, or ethically trained. Often they are referred to with demeaning terms such as "coders", equated with "bricklayers" by thought leaders such as Steve McConnell and others. Under that paradigm, the most common one for software engineers, it would not necessarily be reasonable to hold them responsible for refusing to follow illegal orders: unintelligent implementors simply typing things up on command with a surfeit of semicolons cannot be held to any sort of professional responsibility. But PEs absolutely can and must.

    • baroffoos 5 years ago

      That would require all developers to become lawyers so they can make sure everything they are being asked to do is legal.

      • jakelazaroff 5 years ago

        Are you saying that we can only expect lawyers to follow the law? I have no legal training, yet I manage to avoid breaking the law every day. Why are developers special here?

      • spaceheretostay 5 years ago

        We don't all need to be lawyers to understand the basics of our laws. Knowledge of the existence of the law is assumed for all citizens. Ignorance of the law has never been a valid excuse when the law is broken.

        (Yes, the US does have some new problems with private laws being passed, where the public is not allowed to know the law itself - and this opens up huge scary things. But that is not what is being discussed here and I don't believe is relevant.)

      • wavefunction 5 years ago

        What's wrong with introspection about what you're doing on an ethical level? There's no legal standard involved so you don't have to be a lawyer, just a bit more thoughtful.

        • baroffoos 5 years ago

          The law isn't just ethics. Companies have whole legal teams because laws are so complex. Should a developer be responsible because the cookie banner they implemented wasn't compliant with the laws of 100 countries even though the legal team already told them it was ok?

          • spaceheretostay 5 years ago

            > Should a developer be responsible because the cookie banner they implemented wasn't compliant with the laws of 100 countries even though the legal team already told them it was ok?

            Yes. 100%. "I was just following orders" is not a valid excuse, ever - Nuremberg is the obvious extreme example, but it's true everywhere.

            • tzs 5 years ago

              In fact, "I was just following orders" often is a valid excuse. It didn't work at Nuremberg because it was an extreme example. The orders there were to do things that could not even conceivably be legal, so those who carried them out were considered to have knowingly acted illegally.

              When the orders are to do something that is plausibly legal, and you have good reason to believe that it is in fact so, "I was just following orders" will probably work in most jurisdictions.

              • pluma 5 years ago

                Iff they have confirmation from their product lead that what they're doing is perfectly legal and it isn't obviously illegal, I agree that there's no liability.

                If it's either obviously illegal or it's clearly at least dodgy and they didn't get explicit confirmation from the project lead, "following orders" is not a valid excuse.

                To take the VW case as an example: if your project lead tells you to implement a way to recognise test conditions and adjust the performance to reduce emissions, that is dodgy af and you should at least get confirmation that this isn't illegal (i.e. that it's not intended to cheat on certifications but maybe just for certain internal testing scenarios). In the end the entire chain of command that led to this being implemented is guilty, but if the person implementing that behavior knew what they were doing was illegal or at least suspect and they didn't get confirmation, they're still guilty.

              • spaceheretostay 5 years ago

                None of what Facebook has done here sounds plausibly legal to me. I'm not a lawyer, but this stuff is plain-as-day immoral and illegal from my eyes.

            • thecatspaw 5 years ago

              So a developer should not trust a huge legal team, whose purpose it is to verify that everything is legal?

              How do you expect the developer to figure out it's illegal when a team of lawyers couldn't?

          • wavefunction 5 years ago

            My point is if you err on the side of ethics you likely would never need to worry about the law. I am fully confident the vast majority of people can determine when they're doing something unethical. I am not saying people will not undertake actions they know to be unethical, but avoiding unethical actions will protect that person from exposure to legal liability.

          • pluma 5 years ago

            This is a ridiculous example because it equates clearly immoral and possibly illegal actions with legal trivia (because a hypothetical country may simply require a certain wording for compliance and it's trivial to mess that up).

            Facebook claims to "value privacy" (and some devs have even told me that they value it "more than any other company") but their actions consistently show either neglect or outright abuse.

            Should a developer be punished for implementing something illegal that the legal team signed off on and that wasn't obviously illegal? No, because the legal team is supposed to take the responsibility, and if it wasn't obviously illegal the developer had no reason to assume the legal team was lying.

            Should a developer be punished for implementing something obviously illegal even when the higher ups say "don't worry about it"? Yes. If your boss tells you to rob a bank, you still go to prison for bank robbery.

            For everything in between: whistle blowing is a thing. If you suspect something fishy is going on, document everything, raise concerns and report what is happening.

            Also if you are a well-paid employee in a position where you can easily find another job in the industry, speak up to your superiors and refuse to be complicit. Organise.

    • OrgNet 5 years ago

      You really think execs will see consequences (besides fines)?

mehrdadn 5 years ago

Did anything happen regarding LinkedIn's email harvesting from before when Microsoft bought it? I feel like that was far worse than Facebook's.

  • kerng 5 years ago

    They settled a class-action lawsuit, I think, but if I remember correctly there was some form of consent involved that emails would be leveraged (probably not obvious either, and it shouldn't be done). In Facebook's case it has clearly been misleading, possibly intentionally, multiple times now (email, phone numbers, ...).

    • zamalek 5 years ago

      I didn't like it (which is why they didn't get my damned password), but they were pretty open about what they planned to do with your credentials.

      The problem isn't how open they are; it's that most people don't understand what the harvesting means. Facebook could have asked for the sacrifice of the firstborn and people would have snapped it up on the prospect of a few likes for their fake online alter ego. It's human nature, and the HN echo chamber exists far outside that normalcy. Most people don't "get it" (through no fault of their own) and that's why it's dangerous [edit] and effective.

      • simion314 5 years ago

        >I didn't like it (which is why they didn't get my damned password), but they were pretty open about what they planned to do with your credentials.

        From what I read FB had a "bug" where the feature was not removed properly, so the text about harvesting was removed but by "mistake" the harvesting code was left running.

        • pluma 5 years ago

          I think what happened was that they had an "import friends from your email contacts" feature in the past and the code was reused for "verify your identity via email" but they didn't realise the code would still also upload the email contacts.

          At least that's the story I've heard about why this was an "honest mistake".

dschuetz 5 years ago

Why is "we will never ask for your passwords at any time" not a thing anymore? What Facebook did was phishing, basically. With the password Facebook could be doing a lot more rather than just "upload contacts". Imagine those passwords landing (or accidentally leaking) into the hands of third-party services Facebook is working with! On the user end, what happened to "never ever give away your passwords"? I mean, that's why spear-phishing is so successful, because naïve people give away their passwords in the hopes that this darn annoying login-screen goes away.

  • javagram 5 years ago

    Asking for passwords has been an industry standard for a decade or more. Bad security practice? Yes IMO but companies have been getting away with it.

    Besides Facebook I can think of LinkedIn and Mint as two big examples of SaaS that ask for 3rd party passwords. Mint is even getting your banking information, whereas LinkedIn and Facebook were just doing contact import.

    And of course before the era of SaaS giving applications passwords was normal, e.g. putting your email passwords into an email client like Eudora or Thunderbird. It only really becomes questionable in SaaS where the passwords inevitably end up on a server somewhere subject to a data breach, or, in Facebook’s case, misuse by another piece of its own software.

sidcool 5 years ago

This is probably the third investigation against FB this week. How do we still trust it with our personal data? I have been equally stupid in sharing freely on FB/Instagram; not anymore.

  • jbharter 5 years ago

    After years of less frequent use, I finally quit after the 2016 election. I'm so much better off without it, you can do it too!

  • Funes- 5 years ago

    What about WhatsApp? Virtually no one in my country--not even the elderly, who now broadly use the app as well--is willing to abandon it, no matter how its parent company is depicted. In fact, last time I checked, around 75% of the population uses the app. I don't, and thus I'm paying the price socially--mentally, I'm better off; it goes without saying.

    • midasz 5 years ago

      Same - I don't really speak to some people anymore because I moved away from WhatsApp. The ones I do really want to speak to (family) have installed both apps now, simply because I messaged the group saying "I'll be moving away from WhatsApp - you can reach me through text, calling, and Telegram" and just left/uninstalled WhatsApp. It can be that simple.

orijing 5 years ago

The article claims the practice "was uncovered by Business Insider last week", implying FB was being sneaky about it. But if you look at the Business Insider article (https://www.businessinsider.com/facebook-uploaded-1-5-millio...), you'll see this:

> A Facebook spokesperson said before May 2016, it offered an option to verify a user's account using their email password and voluntarily upload their contacts at the same time. However, they said, the company changed the feature, and the text informing users that their contacts would be uploaded was deleted — but the underlying functionality was not.

> "Last month we stopped offering email password verification as an option for people verifying their account when signing up for Facebook for the first time. When we looked into the steps people were going through to verify their accounts we found that in some cases people's email contacts were also unintentionally uploaded to Facebook when they created their account"

so Facebook discovered this bug in an audit of its code, fixed it, and planned to notify everyone who was impacted.

  • kerng 5 years ago

    Can we please stop calling these privacy violations bugs? It sounds like a benign thing. These are not bugs anymore. It's unauthorized access to records of millions, and Facebook is the one who performed the violation.

    I can give a dog walker or cleaning personnel the keys to my apartment; still, if they steal stuff and I have evidence, they will be prosecuted. It's not a bug that they don't have business ethics.

    • product50 5 years ago

      So a hacker took all of Equifax's data, including your SSNs, addresses, names, DOBs, etc. By your analogy, all of Equifax's engineers should be in jail right now!

      BTW, just in case you are unaware, Equifax got away with this hack with zero fines in US.

      • kerng 5 years ago

        You are mixing things up... In this situation, the hacker is Facebook.

        Most of the other Facebook data breaches, where they didn't secure data properly, would compare more to what you refer to.

        This case is different, though, as Facebook performed unauthorized actions on email accounts, basically breaking in.

        • product50 5 years ago

          I am making a case for the OP's comment that Facebook may have made a genuine mistake by introducing this bug, as they literally called out in their statement.

          A bug is a bug, whether it allows a hacker to sneak in and steal all your data or allows a company to collect data it wasn't supposed to (in this case, Facebook specifically mentioned that it didn't turn off the feature though it intended to).

          • jimsmart 5 years ago

            > in this case Facebook specifically mentioned that it didn't turn off the feature though it intended to

            What you are describing here is in fact a lack of action, or a lack of change policy (to cause such action). That's not a bug. A bug is unintentional behaviour of some code, not some folk who've said they'll do something, but then don't.

            And whether the original behaviour is/was a bug is also a point of contention: that's a lot of willfully bad behaviour chained together somehow to do what it did, then reviewed, signed off, and deployed. That's quite some 'accident'. I write code, and to me this whole thing just smells of a cover-up (FB calling this a 'bug' when it very much looks to be otherwise).

    • orijing 5 years ago

      I'm curious, if the message saying that "FB will also import contacts if you proceed" were still visible, would you still consider it "unauthorized access"? Is it really "unauthorized" if users give informed consent?

      I doubt it, so it seems that we're just bickering over whether the accidental removal of the message is considered a "bug" or a malicious act by some engineer to trick users into sharing their data because they (and their company) lack business ethics.

      Which is more likely?

      • kerng 5 years ago

        Move fast and break things is not what one should do when dealing with personal information of billions of people. People need to be held accountable, Facebook has to be held accountable.

        Maybe a complete engineering stop for a few months, and development of new practices and processes.

        Something similar to what Microsoft did with Bill Gates' Trustworthy Computing memo, which led to the creation of the Security Development Lifecycle, is what Zuckerberg should order.

  • bryan_w 5 years ago

    Yeah, this seems like punishing FB for being too honest. There was no technical reason to disclose the bug. I mean, if they had just quietly deleted the data they didn't mean to collect, it doesn't seem likely that anyone would even have noticed.

busymom0 5 years ago

Are there screenshots or something of this "asking for email password" thing the article talks about? I feel like anyone who sees a Facebook page asking for their email password should already feel a bit warned and skeptical. I had personally never seen such a thing as of 3 years ago, when I deactivated my account. Is this a new thing?

droithomme 5 years ago

So this relates to their practice of ransacking all your email contacts without your consent, engaging in data theft as they upload and analyze them for subsequent actions. And of course LinkedIn is also notorious for this vile, criminal, privacy-violating practice.

I remember maybe 8 years ago it was here on HN that a company was found to be doing this and it was shocking to some. But an executive of that company showed up here and said that "everybody does it" and it's "standard practice" and for some years after that anyone complaining about the practice here was not only downvoted but sometimes warned by our glorious compromised moderators.

In my opinion the practice of contact list ransacking should never be allowed, is clearly unethical, and anyone defending it is an enemy of humanity who should be locked away in order to protect society.

  • blitmap 5 years ago

    You know someday when we have consumer protection laws for things like this, a company will be forced to trace every piece of derived data on a user based off of what was unlawfully collected - and provably delete it. Instead of simply paying a fine and moving along with the new insights they have on you. Like an entrance fee to the club.

    Sidenote: I bet you could make a business out of tracing derived data to comply with lawful orders.

    • radicaldreamer 5 years ago

      It’ll be very hard because it can be impossible to trace a model back to its constituent data, especially if that data is ephemeral

      • blitmap 5 years ago

        Well, see, that's the other thing: "someday" they'll have to make these connections in case they are legally obligated to erase a data source and its derivatives.

      • groestl 5 years ago

        Ephemeral is not an option then, is it? In the face of regulation, some things can't/shouldn't be optimized. That's already the case for more established industries like pharma.

      • yati 5 years ago

        One way is to make sure the provenance is known from source tuples all the way to the model version, and to (eventually) retrain the model when any records upstream of it are deleted.
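        A minimal sketch of that idea, assuming a simple in-memory index (all names hypothetical):

```python
# Provenance tracking sketch: each model version records the IDs of the
# source records it was trained on, so deleting any upstream record flags
# the dependent models for retraining. Names are illustrative only.

class ProvenanceIndex:
    def __init__(self):
        self.model_sources = {}   # model version -> set of source record IDs
        self.needs_retrain = set()

    def register_model(self, model_version, source_ids):
        self.model_sources[model_version] = set(source_ids)

    def delete_record(self, record_id):
        """Erase a source record and flag every model derived from it."""
        for version, sources in self.model_sources.items():
            if record_id in sources:
                sources.discard(record_id)
                self.needs_retrain.add(version)

index = ProvenanceIndex()
index.register_model("ranker-v7", ["user123", "user456"])
index.delete_record("user123")
assert "ranker-v7" in index.needs_retrain   # ranker-v7 must be retrained
```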

        • groestl 5 years ago

          AFAIK Palantir made a business model out of this (i.e. tracking provenance for datasets and models, along with permissions).

          (Just hearsay, not affiliated with Palantir in any way)

    • robin_reala 5 years ago

      We do (mostly) have these laws: it’s called GDPR. Unfortunately it doesn’t apply to everyone, but the good thing is that it’s forcing the internationals to put the software frameworks in place to deal with it, ready for when the US decides to join in.

      • antt 5 years ago

        It's also unfortunate that law makers for 20 years decided that online isn't real so what happens there doesn't really matter.

        • tlb 5 years ago

          You wouldn't have liked it if the government had started regulating the Internet when it was young. There were proposals in the early 90s that it should be regulated like broadcast radio. So no "seven dirty words", for example. It was good that the government left it alone until its effects on society were better understood.

          • antt 5 years ago

            I remember the crypto wars.

            The US government has always controlled the internet. It just made the strategic decision that having a 'free speech' platform accessible to the thought leaders of the 21st century between 1980 and 2010 dealt more damage to its enemies than to itself.

            Now that the internet is reaching the majority of people, it's gotten worse, more regulated and less free. One need only look at the 2016 election with its 'fake news' and compare the US reaction to that of, say, Iran in 2009 to see very little difference in rhetoric between the two countries.

  • jmspring 5 years ago

    LinkedIn is notorious for this. The app still asks for contact access all the time.

    • anitil 5 years ago

      Off topic, but I have to ask - what do you use the app for? In my head LinkedIn is a 'work' thing so I only use it on my laptop.

    • cortesoft 5 years ago

      They are the worst... every few months I will end up with like 20 emails at once asking to connect with a coworker on LinkedIn. They will be sent to 20 different email lists that I am on, which all get sucked up by LinkedIn and spammed. I don’t even have a LinkedIn account.

    • helloindia 5 years ago

      The worst is, even if you deny LinkedIn access to your contacts, your contacts will still show up in suggestions, because others have given LinkedIn access to their contacts. And LinkedIn maps them to your LinkedIn signup email address.

    • rags2riches 5 years ago

      Saying that the app asks is not accurate. It's not presented as a choice and it's not clear what will happen when you continue.

happppy 5 years ago

Facebook stock keeps going up despite so many scandals.

garbonicc 5 years ago

just.stop.using.facebook

  • dang 5 years ago

    Could you please stop posting unsubstantive comments to Hacker News?

    • garbonicc 5 years ago

      what are you some kind of fascist

      • dang 5 years ago

        Ok, since you don't seem to want to use this site as intended, I've banned the account. If that changes, you're welcome to email us at hn@ycombinator.com. We're happy to unban anyone who gives us reason to believe they'll follow the site guidelines in the future.

        https://news.ycombinator.com/newsguidelines.html

cronix 5 years ago

This could be fixed fairly easily. "Hey Russia, if you're listening, it sure would be nice to dox Mark Zuckerberg and all of his contacts." FB would think a bit differently about privacy. Dumb *ucks.

yeah1234 5 years ago

With all these high profile big dollar lawsuits it’s almost as if New York is grasping at straws to find funding.

sonnyblarney 5 years ago

Please investigate 'everyone' for this.

product50 5 years ago

This might have been an honest mistake. That it affected only 1.5M users, when Facebook's user base runs into the billions, might indicate that.

  • jimsmart 5 years ago

    > To do this for 1.5M users only when Facebook's user-base runs into billions

    Well, put like that, it sounds just like they did it as a planned A/B test strategy (like they do to trial other features) — and, personally, I believe this to be the case.

    Deploying such code/functionality is hardly an accident/bug.

  • OrgNet 5 years ago

    You had to give them your email password when you signed up...

  • shard972 5 years ago

    How do you make a form to ask for people's email password by mistake?