MFogleman 6 years ago

> The three most powerful words for building credibility are "I don't know"

I had a supervisor once who always had an answer to your question. He was always 100% sure he was right. He was not always right. Consequently, going to him for help would sometimes result in the problem becoming worse.

The best supervisor I ever had was the one who routinely told me "I don't know, let me call {PersonInOtherDepartment}" or "See if the answer is in {RelevantManual} under {PossibleSection}. If you don't find it, give me a call."

We treat "I don't know" as an admission of failure far too often. It should be seen as an entry point to improvement.

  • csours 6 years ago

    I had 2 team leads at one point - One of them would ALWAYS answer your question, and the other would sometimes say "I don't know/I have to research that/Let's look it up"

    They both had the right answer about 95% of the time, but the one who said "I don't know" was only wrong about 1% of the time.

  • KKKKkkkk1 6 years ago

    In my experience, saying "I don't know" outside of an academic environment leads to two consequences: 1) People will stop listening to you in preference to loud people who "know"; 2) Loud people who know will use your admission against you.

    Sadly, the good habits that you learn in academia will come back to haunt you when you move into industry.

    • schizoidboy 6 years ago

      I doubt there are statistics on such things, so we'll have to deal in anecdotes, and I've had the total opposite experience (I make this comment so that any young people or academics aren't prematurely jaded). When a lot is at stake (e.g. high revenue, medicine, infrastructure, etc.), especially when a team has been burned by loud people, a person who says, "I don't know, but let's figure it out" (that's my job) is prized.

      • dionidium 6 years ago

        This is also my experience. I've heard the claims in this thread (i.e. that people who pretend to know things get ahead, while those who admit that they don't know stuff are never trusted) my entire life, but I've always thought, "who are these people talking about?"

        Whenever I've worked with a blowhard who couldn't admit when they didn't know something literally everybody smart thought of them as a blowhard who couldn't admit when they didn't know something.

        • type0 6 years ago

          > literally everybody smart ...

          Too bad this excludes so many managers, but really this is much more a company-culture issue than anything else. To quote the article: "very few companies are capable of making significant changes in their culture or business model, so it is good for companies eventually to go out of business, thereby opening space for better companies in the future."

    • msangi 6 years ago

      The key is not to say "I don't know" and end the conversation. You should give advice on where to find the information needed and on who is in the best position to answer.

      I've done it plenty of times and my colleagues keep coming back to me when they need help.

    • sorokod 6 years ago

      This may be true, but it doesn't make the loud person credible. Not even in the eyes of the people who are happy to listen to such a person.

    • ssss11 6 years ago

      In the corporate world: completely agree.

  • emmelaich 6 years ago

    I'm always ready to say I don't know. But just to advocate for the opposite -- it is perfectly OK to have a clear opinion, weakly held.

    So, say what you think but at the same time ask (demand?) an argument or discussion.

    A friend has a T-shirt with the words: Always sure, often wrong.

    Also see: https://www.psychologytoday.com/us/blog/work-matters/201002/...

  • sjs382 6 years ago

    "Often wrong but never in doubt."

wenc 6 years ago

Side note: Ousterhout was famous for being the author of Tcl/Tk, which was a popular language and GUI toolkit in the early days of Linux (before Qt and GTK came along).

I wouldn't be surprised if many older Unixes are still running Tcl/Tk apps.

  • patrickg_zill 6 years ago

    AOL's web properties were heavily dependent on Naviserver with its embedded Tcl interpreter, which they bought cheap and turned into AOLserver.

    The Vienna-based university TU-Wien still runs AOLserver and serves 40K users with it. OpenACS.org and dotLRN.org still use the same Tcl-based webserver today.

    Gustaf Neumann shepherded the development and use of OpenACS/dotLRN at TU-Wien, I think: http://nm.wu.ac.at/nm/neumann

    • gnachman 6 years ago

      AOLserver was my first job out of college. Glad someone remembers it!

      • patrickg_zill 6 years ago

        I have customers that are still using it!

  • needlepont 6 years ago

    TCL is a great systems programming language. When I need a glue language for an imperative-model program, it's my first choice.

    • AceJohnny2 6 years ago

      What Lua is doing today, TCL was doing 20 years ago.

      Though I believe Lua is a smaller binary, which matters in some embedded scenarios. According to [1], the core Lua interpreter is 40kB, with additional base libraries of 22kB, for a total of 62kB.

      There are a variety of "small TCL" implementations [2], and one of them, TinyTCL [3], claims to be <60kB, excluding C library functions. I can say that for my embedded requirements, Lua would win...

      [1] https://www.lua.org/notes/ltn002.html

      [2] https://wiki.tcl.tk/1363

      [3] http://tinytcl.sourceforge.net/

      • needlepont 6 years ago

        Interesting. I tried Lua about 6 years ago and found myself unable to see an upside to the language compared with Python and Tcl.

        This is for full-featured *nix systems programming in the HPC world, so YMMV.

        • AceJohnny2 6 years ago

          Lua's niche is absolutely as an embedded extension language for C/C++ programs. Its core benefit is how tiny and yet expressive it is (and it looks like C if you squint). I've seen it show up in a variety of projects in that role, for games or embedded testing environments.

          I indeed don't see it having any benefit over Python or TCL in more sophisticated environments.
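
          For anyone who hasn't seen that embedding role up close, a minimal sketch of the pattern (assuming the standard Lua 5.x C API; the script string is just for illustration, error handling kept minimal):

              #include <stdio.h>
              #include <lua.h>
              #include <lauxlib.h>
              #include <lualib.h>

              int main(void) {
                  lua_State *L = luaL_newstate();  /* create an interpreter */
                  luaL_openlibs(L);                /* load the standard libraries */

                  /* the host program runs user scripts to extend/configure itself */
                  if (luaL_dostring(L, "print('hello from embedded Lua')")) {
                      fprintf(stderr, "%s\n", lua_tostring(L, -1));
                  }

                  lua_close(L);
                  return 0;
              }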

          • abenedic 6 years ago

            The programming model for C FFI that Lua uses is, I think, a little nicer.
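
            For context, the model in question: a C function's only interface to Lua is the interpreter's stack. A sketch, assuming the standard Lua 5.x API (the add function is a hypothetical example):

                #include <lua.h>
                #include <lauxlib.h>

                /* arguments arrive on the stack, results are pushed back,
                   and the C return value is the number of results */
                static int l_add(lua_State *L) {
                    double a = luaL_checknumber(L, 1);  /* first argument */
                    double b = luaL_checknumber(L, 2);  /* second argument */
                    lua_pushnumber(L, a + b);           /* push the result */
                    return 1;                           /* one return value */
                }

                /* the host registers it once with lua_register(L, "add", l_add);
                   after that, Lua scripts can simply call add(1, 2) */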

        • srean 6 years ago

          I have a soft spot for Tcl. Tcl was a pioneer. Lua is smaller, faster, and more tightly and coherently designed. Lua had coroutines right from the start. With Lua you have the option of using LuaJIT. I enjoy both, but I would be more wary of using Tcl in critical systems. But when it's play time, I happily break out tclsh.

          • needlepont 6 years ago

            I've used TCL in production for 15 years. The only thing to really worry about is a consistent policy on language usage. Keeping to the base features of the language and extensions in tcllib/tclx is a pretty solid recipe.

            That said, here are the things I advise:

            * No upvar/uplevel. Use a namespace.
            * Use apply wisely.
            * No ad-hoc (say swig inline wrappers) extensions without deep review.
            * If you are writing a front end in Tk - think twice.
            * If you are doing OOP with TCL choose wisely.

  • _ph_ 6 years ago

    Tcl/Tk is still used a lot. Tcl is still great for scripting applications and ships with many professional tools. Digital IC design depends very strongly on Tcl. After all, it was named "Tool Command Language" because it was built to script the design tools Ousterhout was developing.

    Tk is still the quickest way to create a simple UI, and one that is highly portable across operating systems. With tclkit you can even create self-contained Tcl/Tk applications on the big platforms (Win/Mac/Linux).
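
    As a sketch of what that application scripting looks like from the C side (assuming the standard Tcl C API; the script string is a made-up example, error handling minimal):

        #include <stdio.h>
        #include <tcl.h>

        int main(void) {
            Tcl_Interp *interp = Tcl_CreateInterp();  /* embedded interpreter */

            /* a tool evaluates user scripts that drive its built-in commands */
            if (Tcl_Eval(interp, "set x [expr {6 * 7}]; puts \"x = $x\"") != TCL_OK) {
                fprintf(stderr, "%s\n", Tcl_GetStringResult(interp));
            }

            Tcl_DeleteInterp(interp);
            return 0;
        }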

  • sebcat 6 years ago

    > I wouldn't be surprised if many older Unixes are still running Tcl/Tk apps.

    Or newer ones. E.g., I use gitk daily at work.

  • rabidrat 6 years ago

    The main configuration language of F5 BIGIP (load balancers) is TCL.

weinzierl 6 years ago

My favorite saying from John Ousterhout:

> What's Wrong With Threads?

> Too hard for most programmers to use.

> Even for experts, development is painful.

I read the slides for his presentation "Why Threads Are A Bad Idea (for most purposes)" [1] in the late 90s and it saved me from a lot of confusion in the following decades. This was especially helpful in a time when Java peaked in popularity and people were like: "Hey, threads are cheap and easy now, what's the problem when our solution uses thousands of them?".

[1] https://web.stanford.edu/~ouster/cgi-bin/papers/threads.pdf
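
The failure mode behind that advice is easy to demonstrate. Here is a minimal sketch (mine, not from the slides) of two POSIX threads sharing an unprotected counter:

    #include <pthread.h>
    #include <stdio.h>

    static long counter = 0;              /* shared, unprotected state */

    static void *worker(void *arg) {
        (void)arg;
        for (int i = 0; i < 1000000; i++)
            counter++;                    /* unsynchronized read-modify-write */
        return NULL;
    }

    int main(void) {
        pthread_t a, b;
        pthread_create(&a, NULL, worker, NULL);
        pthread_create(&b, NULL, worker, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        printf("counter = %ld\n", counter);  /* almost always < 2000000 */
        return 0;
    }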

tejinderss 6 years ago

His book, A Philosophy of Software Design, is an excellent read.

  • solidist 6 years ago

    Yes. So much so that I wrote a short piece on it, stacking it up there with other timeless books. https://hackernoon.com/meta-skills-of-a-software-engineer-be...

    For those who are unfamiliar: the book is based on observations of student groups building large systems and then reviewing and discussing sources of complexity -- then they swap their systems with other groups. They then need to keep iterating... usually hitting problems that we find in the wild. The book is the result of his observations.

    The thing I would recommend John do next in his Stanford course is to encourage the students to infuse testing approaches into the systems they build. This could reveal even more of the value of the craft in a setting that is somewhat controlled.

  • tenaciousDaniel 6 years ago

    My co-worker recommended this book to me a few hours ago and I had never heard of the guy. I just listened to a talk of his at Google and he seems super smart.

asdfman123 6 years ago

I took a class from John Ousterhout. Great guy.

However, file this under "too honest":

> For example, a few years ago I started working on my first large Web application

I've spent most of my career learning the do's and don'ts of building large web applications... I assumed Professor Ousterhout had more experience in this sort of thing than I do!

  • brlewis 6 years ago

    Sounds like excellent timing to me. I've been building web apps since 1994. If he wrote this in 2017, then tools for building large web applications had just started getting good.

l33tbro 6 years ago

>The three most powerful words for building credibility are "I don't know"

I agree with this until you come across colleagues with the Jobsian 'reality distortion field'. Or even the social dynamics at play with the orange man that lives in the big white house.

We absolutely assign credibility to people with integrity and self-awareness. But, unfortunately, we also seem to have a capacity to be charisma vultures and will happily believe someone's bullshit if they are making us feel good in the present moment.

  • Normal_gaussian 6 years ago

    One of the ways to deal with this is to provide a positive assertion that includes the asker - "we shall need to work it out". "I don't know" is negative, reflects solely on yourself, and provides no path forward.

type0 6 years ago

The take home message here should be:

> There are 2 kinds of software in the world: software that starts out crappy and eventually becomes great, and software that starts out crappy and stays that way.

DubiousPusher 6 years ago

>> The most important component of evolution is death

I appreciate the concept. And it has the fun of sounding morbid. But this is a bit like saying the most important part of an ICE is the piston, which is silly because a piston needs a confined space and a good fuel to operate. It doesn't work without all three (and more).

In this case, a mechanism of genetic change is at least as important to evolution as death (arguably more so). Death is obviously very important, though.

KasianFranks 6 years ago

John is the reason why I use Tcl to this day. Good times back at Sun Labs.

naveen99 6 years ago

> The greatest performance improvement of all is when a system goes from not-working to working

Except when speed is part of the definition of working. For example, deep learning: the correct implementation was not enough until the hardware was fast enough.

  • mmt 6 years ago

    > The correct implementation was not enough until the hardware was fast enough.

    This doesn't refute his argument, which I read as being, essentially, against performance optimization of software during initial construction. Knuth famously bemoaned premature optimizations, as well.

  • dwaltrip 6 years ago

    Speed isn't in the definition of "working" for deep learning. Speed is an implementation detail. Perhaps a very critical one, but still an implementation detail. One can evaluate the success of a deep learning program (e.g. "is it working?") without knowing how much processing power it used.

h4b4n3r0 6 years ago

>> If you don't know what the problem was, you haven't fixed it

There’s a more pithy variant of this, which I use all the time: “You can’t fix what you don’t understand”.

andrepd 6 years ago

>Programmers tend to worry too much and too soon about performance. Many college-level Computer Science classes focus on fancy algorithms to improve performance, but in real life performance rarely matters.

Stopped reading there. A great plague of modern software development is a complete disregard for performance or resource conservation.

  • ploxiln 6 years ago

    No need to stop there - it's just historical, very old advice, the rest is pretty good. It underestimates how aggressively inefficient today's trends and frameworks are. It actually advocates for focusing first on simplicity. And I think Ousterhout would not consider an "easy-to-use" "developer-friendly" but big and featureful framework to be that kind of simplicity. He says:

    > "faster" algorithms often have larger constant factors, meaning they are slower at small scale and only become more efficient at large scale

    So what I think he's worried about is sophisticated large-scale-oriented algorithms, but that's really not the problem today, which is huge frameworks and middleware and cluster applications, and the desire to use the very latest of these things. More than once I've heard a frontend dev say they want to use "react and redux" for the next project, without knowing what redux is. I've also seen many backend teams decide they need to use Cassandra for scalability and availability, but the cluster ends up being overwhelmed or down for half a day, every month or so, because you need to be a JVM and Cassandra expert to keep those clusters working. I think this kind of thing is exactly what Ousterhout would warn against.

    The zeitgeist has swung wildly in the other direction, developers are no longer eager to implement the most complicated algorithm for theoretical optimization, now they are eager to implement the most complicated frameworks and services for "best practices and to be more maintainable" than the last mess they made. But the recommendation is the same in both cases: make it simple. Study the problem, clean up the mess (hard work!), keep it simple.
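
    To make the constant-factor quote concrete, here is a minimal sketch (mine, not from the article) of the standard trick real sort implementations use: fall back to the "slower" O(n^2) algorithm below a small-n cutoff, because its tiny constants win on short ranges.

        #include <stdio.h>

        #define CUTOFF 16  /* below this size, the "slow" algorithm is faster */

        /* O(n^2), but with a tiny constant factor: ideal for short ranges */
        static void insertion_sort(int *a, int lo, int hi) {
            for (int i = lo + 1; i <= hi; i++) {
                int v = a[i], j = i - 1;
                while (j >= lo && a[j] > v) { a[j + 1] = a[j]; j--; }
                a[j + 1] = v;
            }
        }

        /* O(n log n) on average, but with more overhead per element */
        static void hybrid_quicksort(int *a, int lo, int hi) {
            if (hi - lo < CUTOFF) {
                insertion_sort(a, lo, hi);  /* small range: constants win */
                return;
            }
            int pivot = a[lo + (hi - lo) / 2], i = lo, j = hi;
            while (i <= j) {  /* Hoare-style partition */
                while (a[i] < pivot) i++;
                while (a[j] > pivot) j--;
                if (i <= j) { int t = a[i]; a[i] = a[j]; a[j] = t; i++; j--; }
            }
            hybrid_quicksort(a, lo, j);
            hybrid_quicksort(a, i, hi);
        }

        int main(void) {
            int a[] = {9, 4, 7, 1, 8, 2, 6, 3, 5, 0};
            hybrid_quicksort(a, 0, 9);
            for (int i = 0; i < 10; i++) printf("%d ", a[i]);
            printf("\n");
            return 0;
        }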

    • mmt 6 years ago

      > use Cassandra for scalability and availability, but the cluster ends up being overwhelmed or down for half a day, every month or so, because you need to be a JVM and Cassandra expert to keep those clusters working.

      Phew.. it's not just me. My one exposure to Cassandra [1] in a production environment mirrors this experience very closely. I felt a bit like the child in the tale looking at the nude Emperor.

      Each node held only a single SSD, so increasing database size meant buying an additional multi-thousand-dollar server. Upgrading RAM meant upgrading all the nodes, which became an O(number of nodes) problem in terms of time and cost.

      The backend team may have had a JVM expert or two, but obviously no Cassandra (nor database in general, and certainly not hardware) knowledge. It was the Ops team who had to keep it limping along and gain the expertise.

      It's just another case of "it works for FAANG, so it'll work for us" thinking, even though "us" is 2-3 OOM smaller and would be better served by the simplicity of a scaled "up" non-distributed system.

      [1] Also plenty of experiences with non-Cassandra distributed (usually "NoSQL") databases. The results have ranged from grossly over-engineered (i.e. high cost) to unreliable and/or high-maintenance. Fortunately, rarely both at the same time.

  • jccooper 6 years ago

    The problem is not this advice, but that people tend to stop reading (or remembering) there. It's misapplied or misunderstood so often that I wonder if it shouldn't be promulgated. Today something like this may be more appropriate:

    "You should pay attention to performance, but your algorithms probably won't be the problem. It's more likely you're doing something stupid or wasteful, probably with the help of your technology stack."

    • andrepd 6 years ago

      >The problem is not this advice, but that people tend to stop reading (or remembering) there. It's misapplied or misunderstood so often that I wonder if it shouldn't be promulgated.

      That's EXACTLY my problem whenever I hear this or similar advice. It's exactly what I think when I hear people (mis)quote Knuth by mindlessly parroting the old adage (optimization is the root of all evil). Too many people interpret this as a licence to waste resources and bloat their programs with reckless abandon, to such an egregious degree that I too think people should probably not spread stuff like this.

  • remoteorbust 6 years ago

    In case you didn't know, the author built the programming language that formed the backbone of some of the original multithreaded web servers, which reached an incredible scale for their time. He is currently involved in research on "core aware thread management" and "nanosecond-scale logging".

    This causes me to interpret statements like "Programmers tend to worry too much and too soon about performance." more charitably. I think he does care about real performance.

  • froderick 6 years ago

    > too much and too soon

    That's the critical part of that statement. It is not important to solve for performance concerns until the code actually works so you can verify you've solved the problem at hand.

  • mlevental 6 years ago

    Why do people post this kind of stuff, "stopped reading right ...."? Congrats, now we all know you're too arrogant and too impatient to finish a 1000-word essay. Don't you realize that dismissing the bulk of the text based on just one comment is ignorant? Do you think it matters to anyone what you think relative to this person? Do you think you have some deep insight that others don't (such that they don't already realize that that particular point should be contextualized appropriately)? I just don't understand people constantly trying to assert their delusional superiority at every vague opportunity - let me kindly inform you that the OP posting this wasn't an invitation.

    • JoshCole 6 years ago

      Worth noting that probabilistic reasoning via approximation is actually pretty cool. It's just that the same sort of thing that makes it great is also what makes it terrible: the upside of not being thorough is being fast, and the downside of being fast is not being thorough.

      I find it neat that the person arguing for speed is employing heuristics which short-circuit reading.

      A great way to look at human biases is through the lens of the good they cause. It makes them all make sense in a way that looking at them through the lens of failure cases doesn't. The world is awe-inspiring in its complexity, and coping with that efficiently and in real time requires trade-offs. Catch the same person who appears delusional in a real-time context at a moment when they can think for longer, and their thinking becomes much more logical and mathematical.

    • andrepd 6 years ago

      It was hyperbole. I did actually read the rest of the points, I just didn't pay any more attention to that one.

      I don't feel any kind of superiority, I was just voicing my opinion. Maybe you're projecting?

  • needlepont 6 years ago

    Performance _or_ resource conservation. Which one is it?

    Application design and domain knowledge are more important than rote optimization. I think that is what Ousterhout is getting at.

  • dcuthbertson 6 years ago

    I agree with you. I've worked on drivers for virtual NICs. Every performance boost was well received by customers.