Interesting that this quote was initially about stock options at tech companies. It turned out that stock options did become nearly universal in tech compensation, and companies that granted them outcompeted companies that did not. So the management that was ostensibly “doing a massive blag at the expense of shareholders” wasn’t really doing that: time vindicated their practices, and things like option backdating and not treating options as an expense weren’t even really necessary. But it took a few years, and it wasn’t obvious in 2002 that this is how it would play out.
And relevant to the title quote: maybe it should be amended to “good ideas do not need a lot of lies to gain public acceptance eventually”. The dynamic here is that a significant part of public opinion is simply “well, this is how things work now, and it seems to be working”, and any new and innovative idea by definition is not going to be how things work now. The lies are needed to spur action and disturb the equilibrium of today. But if you’re still telling lies a few years in, you’ve failed, and it was a bad idea to begin with.
> stock options did become nearly universal in tech compensation
Although I've noticed that options have been replaced more and more these days with RSUs (plain old grants) because options have a tendency to go "underwater", suggesting that they weren't all that great to begin with.
Right, they go underwater precisely when the company is not doing well and you are at greatest risk of losing the job. That's not a great risk profile.
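To make that risk profile concrete, here's a toy payoff comparison; the grant sizes and prices are all made up for illustration:

```python
def option_value(shares: int, strike: float, price: float) -> float:
    """Intrinsic value of vested stock options: zero once underwater."""
    return shares * max(price - strike, 0.0)

def rsu_value(shares: int, price: float) -> float:
    """RSUs keep some value at any positive share price."""
    return shares * price

# Hypothetical grant: 1,000 options struck at $50 vs. 1,000 RSUs.
for price in (80.0, 50.0, 30.0):
    print(f"price ${price:.0f}: options ${option_value(1000, 50.0, price):,.0f}, "
          f"RSUs ${rsu_value(1000, price):,.0f}")
```

At $30 a share the options are worth exactly nothing while the RSUs are still worth $30,000, which is why the option holder's downside is concentrated in the same scenario where layoffs are most likely.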
Options have some minor value in signalling that you're a true believer. You should in fact care only about base salary, but not telling the people doing the hiring that can be quite useful. Doing a fake come-down on base in exchange for options shows you are invested and surely worth hiring.
Almost any useful innovation is going to have a right tail of people who overhype it. They shouldn't, and I wish they wouldn't. But if your strategy for evaluating new ideas is to find the biggest sources of hype and fact check them, you're going to systematically undervalue innovation.
For me the danger of AI is that it enables the surveillance state through facial recognition and the instantaneous aggregation of all my data. For "national security" reasons, I may be detained and denied my rights if Palantir hallucinates. Who do I sue if Palantir decides I am an illegal?
The thing is a government never needed technology to be authoritarian. The government today already has all the tools to ruin your life. It had them in 1940. It had them in 1840 and it had them in the year 40 as well. And that tool is known as the monopoly on violence. It can be wielded in many ways good and bad.
The last line of GP's comment is key here: "Who do I sue if Palantir decides I am an illegal?"
This shouldn't make as much of a difference as it does, but due to how our legal system works, it's much harder to get meaningful legal satisfaction when an algorithm (or other inhuman distributed system) commits a crime against a person than when a person does so.
Or worse, because it didn't hallucinate, and they are coming for you as a free-thinking "radical". They can tell from a long-deleted blog post you made in 2005 about green energy.
Why bother with all that though? Just ask them to do their job for the party. If they don't, or you suspect they don't align with the party, you just execute them. Don't need tech for this. The tech is just for some people to get rich, not to really enable any new evil that can't already be achieved today with pen and paper and bullet (as modeled extensively in the last century).
Put it this way, if Hitler had grok, would it really get any worse for the Jews? I don't think so. I think they would be screwed no matter what.
Because you can't do the Nazi Germany thing these days. I mean... disgust aside, it kinda failed. But you can spy on people under "national security" while keeping them feeling happy enough. And that arrangement can last 1000 years.
Still not convinced that AI is offering anything new here. Especially when the statistics you'd reach for are often 100 years old or more - Bayes' theorem is older than the United States. I think among lay people there is a lot of conflation between AI and statistics, and also a lack of understanding of the state of that field and how mature it is. Nazi Germany of course heavily used statistical modeling and even contracted with IBM to quantify Jewish populations.
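As an illustration of how old and simple the underlying math is, here is Bayes' theorem (1763) applied to a hypothetical watchlist scenario; every number below is invented for the sake of the example:

```python
# Bayes' theorem: P(radical | flagged) = P(flagged | radical) * P(radical) / P(flagged)
p_radical = 0.001            # 0.1% of the population (hypothetical)
p_flag_given_radical = 0.99  # the model flags 99% of true positives...
p_flag_given_other = 0.01    # ...but also 1% of everyone else

p_flag = (p_flag_given_radical * p_radical
          + p_flag_given_other * (1 - p_radical))
posterior = p_flag_given_radical * p_radical / p_flag
print(f"P(radical | flagged) = {posterior:.3f}")  # ~0.09: mostly false positives
```

An 18th-century formula is enough to show that a "99% accurate" screen on a rare category still flags mostly innocent people - no modern AI required.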
This is what scares me the most about AI. You have a handful of really big companies trying to outdo each other as they race to implement it and deploy it as quickly as possible.
To justify their outrageous capital spending on data centers, they are incentivised to exaggerate its current capabilities and also what it will be capable of 'soon'.
There is no time to evaluate each step to make sure it is accurate and going in the right direction, before setting it loose on the public.
I guess a counterpoint might be Apple's "strategy". Scare quotes because I truly don't know if it was deliberate or just a happy accident. But somehow they've managed not to get so intensely exposed to the downside risk: if the wild claims about AI don't pan out, they're not going to lose very much compared with the other megacorps.
It is also a useful trick to keep in mind the opposite of critical thinking - following the herd. Just copying everyone around you is often a great strategy. So good that even if everyone around you is making mistakes it can still be the dominant strategy (there is a reason a lot of people who don't like war are cowed into silence when war fever descends). Most people are using it.
That implies that it is ridiculously easy to be right when everyone else is wrong. People aren't trying to be right. Any sort of principle-based analysis easily outperforms the herd. When leaders in society start lying that is indeed one of those situations. Pretty much any situation where everyone knows something and the hard statistics are telling a different story is.
The more pressing problem is how to go from a lovable Cassandra to someone who can preempt major events and convince the herd to not hurt itself in its confusion. Coincidentally that is how markets work, people who have a habit of being right are given full powers to overrule the mob and just do what they want. Markets don't care if everyone believes something. They care if people who got the calls right last time believe something.
In this case, the US hasn't seen a good outcome to a war since something like WWII and even there they waited until the war was mostly over and the major participants in the European theatres were exhausted before getting involved. The record is pretty bad. Iraq was an easy call to anyone who cares about making accurate predictions.
> That implies that it is ridiculously easy to be right when everyone else is wrong
I think this is true but misleading — conditioned on other people going with the herd in the wrong direction, it is easy to be right. However, often the herd is going in a right (or at least acceptable) direction. The continual effort to check if the herd is going in the right direction _is not_ easy. If a magic eight ball could alert you “hey the herd is wrong right now, take a closer look”, that would be great! But we have no such magic eight ball.
I have experience in public advocacy advertising. My short opinion is this: respectfully, I disagree. Coal energy: OK, good idea in principle - folks love energy - but it's not hard to see it's not great for the environment. Solution for the coal industry: advertisements that say "we wash our coal", and everyone is OK with it. "Washing coal = less environmental impact" is clearly a lie. "Good ideas <> lots of lies" is too simplistic a concept. What's good for you and me isn't necessarily good for everyone. It's a complex world. Public acceptance is a complex subject. At risk of getting flagged... think about a "Make HN great again" campaign. What comes to mind? ;-) Public acceptance <> good for society.
p.s. If history teaches anything, it's that public acceptance is sometimes manufactured. On the Internet nobody knows you are a dog ;-) (cue the actual humans responding to my post... 3... 2... 1...)
I think you just reinforced the articles point. Coal power needs lots of lies to justify it, as per your own statement.
That is in fact because coal energy is a terrible idea. It has 0 upsides compared to renewable alternatives, and is on the whole worse than even other non-renewable alternatives.
If you have to lie to make it sound good, that's probably because it isn't actually good.
Well, there's a survivorship bias that I think plays into the quote.
If it's a good idea that's obvious, it's already used widely. If it's not obvious, you'll still have to convince people. None of that requires lots of lies, though.
The burden of proof is on the people who ever believed Bush in the first place. I'm more curious how THEY got everything so WRONG. Everyone whose credulity helped lead to the deaths of a million Iraqis should show us their reasoning; trusting an obvious fool is never defensible.
> My reasoning was that Powell, Bush, Straw, etc, were clearly making false claims and therefore ought to be discounted completely, and that there were actually very few people who knew a bit about Iraq but were not fatally compromised in this manner who were making the WMD claim
At the risk of missing the point, I have to say that knowing what we know now, this is a very poor heuristic. Predicting a lack of WMD was correct only by coincidence, and in any case irrelevant to the decisions made about the war in Iraq.
What is this blog post even saying? When you can't distinguish a lie, trust the room vibes? Seeking comfort won't give you any answers or get you closer to the truth.
Not enough people ask "why". They instead argue about effectiveness or correctness. At some point you have to determine whether you're chasing the truth to make a decision or just for its own sake. In the vast majority of cases what you want is a decision that will produce the desired results. That's the real reason why lies happen and why knowing the truth doesn't get you anywhere and often nobody cares.
EDIT: for the sanity of any late replies. My bad. I replaced the part about AI with something I thought was more interesting.
It pretty clearly says, "Do not give liars the benefit of the doubt with respect to their current claims." If you want to believe there are WMDs in Iraq, do it because you have evidence, or at least the word of trustworthy people. Don't assume that there has to be a little fig leaf WMD in Iraq because the Emperor wouldn't really go out in public naked.
Was it immaterial to the fact that we were going to war, regardless of the effectiveness of the "sell"? Yes, that's true, but it gives a lot of cover to the Bush administration that so many people, including 110 Democratic congressmen, voted for the authorization to use military force.
Why is it being re-posted now? Who knows... AI, Iran, whatever.
> Right now, we have a similar situation with AI. Not enough people are asking why AI is being pushed so hard. Instead they pointlessly bicker about its effectiveness.
We know why it's being pushed so hard - people need a return on all that money being burnt.
Its effectiveness is argued about because it's not clear one way or the other where things are, where they are heading, and where they will end up.
There has been a strong push for AI/AGI since before computing, so every time there's a breakthrough to the next level there's a hype wagon doing the rounds, followed by an "oh, actually it's not there yet". This time, like every other time, we go through an "is this the time? It's so tantalisingly close".
Are we actually there now? Emphatically no.
Are we at a point where it's usable and improving our lives? Yes, with a PILE of caveats.
Edit: I wanted to add
There are always "true believers" whenever there is a fork in the road, and con artists looking to take advantage of them. But that happens whether there is a genuine breakthrough or not - the hype is never a guide to whether the breakthrough exists, so purely being a sceptic isn't worthwhile (IMO).
(Google tells me this is a relevant summary of US GAAP https://carta.com/uk/en/learn/startups/equity-management/asc... )
"Don’t worry about people stealing your ideas. If your ideas are any good, you’ll have to ram them down people’s throats." -- Howard Aiken
I've always taken this to mean that, usually, the good ideas are the crazy-sounding ones.
This was the stated purpose of the war! If Bush and Blair had said "there are no WMD in Iraq", the war would not have happened.