Hacker News
Ask HN: What bits of fundamental knowledge are productivity multipliers?
413 points by stardustpie on March 30, 2022 | 419 comments
I recently realized that certain kinds of knowledge allow one to be significantly more productive when solving a large class of problems.

For example,

* Regular expressions for simple text processing.

* Parser combinators for parsing.

* Parser generators (esp. packrat variety) for parsing.

* The concept of fuzzing and property testing for testing code.

* Calculus for solving all sorts of problems.

* MCMC for solving a huge class of probability problems.

* Search algorithms for solving a variety of problems (e.g. all NP-hard problems, sudoku, HTNs, scheduling, planning).

* Gradient descent for solving a variety of optimization problems.

* Vector space embeddings as a conceptual tool for a variety of complex AI problems.

* Effect composition (Haskell's IO or Scala's ZIO) as an incredibly powerful paradigm for concurrency and parallelism.

What are some examples of 10x multipliers that come to your mind? Fundamental ideas without which you would be drastically less productive.
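As a tiny illustration of the first bullet, here's a regex pulling fields out of a made-up log line (the line and pattern are invented for the example):

```python
import re

# Hypothetical log line; the pattern captures timestamp, level, and message.
line = "2022-03-30 14:02:11 ERROR disk full on /dev/sda1"
m = re.match(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (\w+) (.+)", line)
timestamp, level, message = m.groups()
```

A few lines like this often replace a whole hand-rolled scanning loop.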




- Writing as a form of (or tool for) thinking; Leslie Lamport said (maybe quoting someone) that if you're thinking but not writing, you only think you're thinking;

- High tolerance for feeling ignorant, confused, silly, inadequate, a novice: none of these states should faze you: you should not have a comfort zone: let your mind feel at ease in not understanding something: go to the eye of the storm and weather it: you'll come out being more capable;

- Formal specification (maybe TLA+) when doing something unintuitive, like non-trivial concurrency; or, simpler put, think before you do;

- Functional programming, immutability, state machines, reactive programming: whatever you can do to make your systems more declarative and their state easier to reason about;

- I'm a geek for tools, and I know that not everyone is like me, but for me choosing the right stack for the job is a big deal, and when saying stack, I mean every tool I'll be using, from the programming languages, to deployment tech, to testing setup; a good tool can effortlessly solve a host of problems.
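To make the state-machine point concrete, a minimal sketch (states and events are hypothetical): enumerating the valid transitions up front makes the reachable state space explicit and easy to reason about.

```python
# All legal (state, event) -> next-state transitions in one place.
TRANSITIONS = {
    ("idle", "start"): "running",
    ("running", "pause"): "paused",
    ("paused", "start"): "running",
    ("running", "stop"): "idle",
}

def step(state, event):
    # Anything not listed above is rejected loudly instead of silently
    # mutating into an inconsistent state.
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"invalid event {event!r} in state {state!r}")
```

The table is data, not control flow, so it can be inspected, tested, or drawn as a diagram.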


> High tolerance for feeling ignorant, confused, silly, inadequate, a novice: none of these states should faze you: you should not have a comfort zone: let your mind feel at ease in not understanding something: go to the eye of the storm and weather it: you'll come out being more capable;

I’d love to be more at ease in uncomfortable situations. Any tips?


I will give you a fairly specific answer that comes from my history of going to therapy for anxiety and depression.

Whenever I find myself in an anxious or uncomfortable situation, the thing that I need to remind myself is that things are going to turn out okay for me, whatever happens. I do this by working my way up a version of Maslow's hierarchy:

1. Am I physically safe? Can I breathe? Can I move my body?

2. Do I have access to shelter? Do I know where my next meal will come from?

3. Are my loved ones safe? Do I have friends and people who care about me?

4. Do I have skills that I care about? Can I do work that I'm proud of?

5. Do I have a job that gives me money? That I like to do?

In pretty much any uncomfortable situation you're in (at least in the context of career improvement), almost all of these things are totally unaffected. The sun will rise tomorrow, even if I bomb this programming interview. My friends will still be my friends, even if I do a bad job networking at this event.

And the fact that so many of the good things in my life will still be good, even if an uncomfortable situation goes as badly as it can, gives me a lot more confidence.

And then, like anything else in the world: practice, practice, practice. The more uncomfortable situations you put yourself in, the more you practice reminding your brain that "everything is okay, actually," the less uncomfortable you'll feel in any given situation.


During stressful times, I’d have agoraphobic attacks in the car. They were pretty terrifying. I’d open all the car windows and blast the radio. But you know what worked much better? Solving math problems in my head.


I like this one too!

A big part of learning to deal with anxiety is understanding that an anxiety attack is rooted in a chemical, physiological response that you can't control; sometimes what you need to do is just ride it out, and let your body get back to normal.

Math problems, loud music, singing songs -- great ways to "get out of your brain's way" as my therapist used to put it.


My go-to is trying to visualise and rotate various 3D shapes, like a cube, tetrahedron, dodecahedron…


This is great, and it made me realize I had started doing something similar recently: when I become anxious about something, say money, I ask myself: are my kids healthy and safe? The answer, so far, has always been yes, and that calms me down immediately.


That's awesome! Glad to help you put a concrete framing to it. The more intentional you make it, the more effective it is, I think.

There's a reason people keep pictures of their family at their desks; it's not just for decoration :D


I am totally with you on this. Catastrophizing when things get difficult makes it so much harder.

I make it a point to count my many blessings (most of which you enumerate there) every day.


Thanks for an EXCELLENT answer.


I'm going to go a different advice route than the other commenters here.

If you went to the gym and did a really easy workout that was comfortable, that would be a bad gym session. The goal is to push yourself: not to injure yourself, but to go until failure.

We don't learn in the comfort zone. We learn and grow by pushing ourselves where we feel like we might fail and then pushing until we fail.

Success isn't a game of avoiding failure. Success requires learning and growing from repeated failure. Failure isn't the opposite of success, it's part of it.

Feeling dumb, not knowing where to go, having difficulty even reading and comprehending material is when we know we are engaged in building ourselves.


There comes a point in your life where this may no longer be true. At 70 I doubt I'll be pushing "to failure" in training sessions. In fact, I'll probably consciously reduce the intensity of my training sessions and use them to maintain muscle mass for as long as possible into old(er) age.

The same goes for work. I'm a hard worker, but I'm not gonna bust my balls at some startup so the CEO and his buddies can make a billion. Not unless I'm one of the founders.

Some people have already done a shit ton of learning and growing, and maybe they're at a different stage in their lives. Like the stage where their 30+ years of experience are extremely valuable without grinding 50 hours a week.


You've confused personal growth and "the hustle".

You should, as a person, work to grow and be stronger no matter your age, whether that's squatting 400lbs or doing your water aerobics. Work the correct weight for you.

You should not bust your balls at some startup so someone else can get rich. That isn't helpful. Don't do that, regardless of your age.


Busting your balls for someone else's start-up is a rite of passage unfortunately, but as long as you don't burn 10 years of your life before realising that your stock is worthless, you're ok. That mistake shouldn't cost 10 years.


In hindsight, what we don't know is if the people who grinded 50 hours a week would have ended up in a stable situation later in life where they felt confident enough and had the means to take things at their own pace.

Regardless, everything has a cost. If you are in a place where you can spend time at your leisure then that is something to be admired. You likely paid the cost for that (by grinding 50 hours a week) up front.

For the younger crowd here, perhaps they should grind 50 hours a week so they too can build their "fortress of solitude".

But if they can do it at half the cost, that would be ideal, wouldn't it?


> that would be a bad gym session. The goal is to push yourself. Not to injure yourself but to go until failure.

Umm. Actually, as far as I know, that's completely false. What you're describing is a bad way to train your body. If you're failing reps, you're training your body to fail. I.e. "one more rep" culture is misguided. You don't want stress or exhaustion hormones in your system. If your goal is enhancing your body, that is. Looking at my notes, my source is Pavel Tsatsouline #1399 episode on Joe Rogan's podcast. If you can get a hold of that podcast, it's very informative.

I won't comment on what you said about the mental aspect of failure.


I don't love the marketing "feel" of this source but it is a good and comprehensive guide on what failure means in a lifting context: https://athleanx.com/articles/should-you-train-to-failure

You don't "train your body to fail". You may be training bad form - and form failure is failure.


That's great if you're in a body building competition, but you're training to have a heart attack at 65.


You sound like me 10 years ago. Be careful pushing yourself in the gym until total failure; it can directly lead to injury, especially in your shoulders, knee joints and lower back. It's tricky to have good joints and be very big and buff all your life. Muscles grow, joints do not. Doing fewer than 8 reps will help you grow, but it will kill your joints.


Learn a new language! Be fearless when speaking it, even when you know you're wrong. The results are humbling, often hilarious, and get you accustomed to frequent failure. This is also a good approach for people who are socially awkward or want practice just speaking to people. There are literally millions of people interested in doing "language exchange" sessions, where you do x minutes in your native tongue in exchange for x minutes in their native tongue. Not only will you learn a new language, but asking questions about idioms you do not understand is amazing practice for "anything" you don't understand.


This isn't really an answer, but I'd say that everyone's different. What works for you might be wildly different than what works for me.

For what it's worth, I'd say that it's about being at ease with being uneasy. I find it unproductive to try to get rid of uneasiness.


For me it's practicing art.

Mainly playing music with a group, sometimes writing, and occasionally drawing or painting. It focuses me on the emotional experience and response to entering the unknown and finding flow there.

There's a book called Art and Fear that nails the tao of it.


Seconding this recommendation heartily!

"Art & Fear: Observations on the Perils (and Rewards) of Artmaking"

It's surprisingly relevant to what folks like us go through, and how we relate to our creations, intangible though they might be (at least if you're purely in software).


Read Richard Feynman. The man was a genius and would ask plenty of simple questions early on. If he could check his ego, all of us should be able to.


For me, it's closing my eyes, taking a few deep breaths, and then continuing. It's okay to be nervous, to feel one's heart pounding, and to just notice all those physiological reactions, but keep going despite that. My old tendency was to switch to something more comfortable; instead, just take a silent minute without switching tasks and then keep going.


I have a friend who took anti-anxiety meds and was immediately extremely productive.


Agreed, I was on anti-depressants for years.

Their most noticeable effect on me was a dramatic decrease in anxiety, and an accompanying increase in comfort in social contexts and far less catastrophizing about challenging life situations.


> High tolerance for feeling ignorant, confused, silly, inadequate, a novice

What are some tools to do this? When I feel I don’t know something, I feel very anxious, stupid and unhappy. How did you develop the skill to be okay when ignorant?


This is pretty much about learning to deal with your feelings. Different things work for different people. Mindfulness and acceptance have been interesting concepts to explore for me. Though there too exist a variety of approaches. Explore related literature.


Wow, I've never thought of "functional programming, immutability, state machines, reactive programming" as time savers, just better ways of coding. I'm a self-taught engineer, and all of these just seem like the correct way to code things, because it's much smoother and less faff.


Not technical but the self-knowledge of how to put yourself in a productive state. Knowing how to sleep well, eat well, exercise well to allow yourself to perform at a sustained high-level.

Someone may be more technically competent at a problem-space (at the start), but if you are able to work diligently at the problem over a sustained period, you will have 10x better results than those who crash/burnout/lose interest/try to sprint to the finish.


As someone who struggles with ADHD and depression, this isn’t always possible.

At 38 years old, what I have discovered are ways to make myself valuable to my employer even when I’m not at my peak. The move to a more “architectural” role has been a godsend for me. It means that I’m able to serve, to some degree, in an advisory capacity for others. When I’m not able to enter “flow” with my primary project, I can usually find someone else with a problem they’re struggling with, pair up with them, and help them get over whatever is in their way. This makes the team as a whole significantly more productive.

As a bonus, I often find that focusing on a problem of short duration helps me overcome the mental barriers to becoming engaged in my work, and lets me get done what I'd been struggling to make myself do in the first place.


ha, lucky. I have a similar constellation of problems and have never found an employer willing to tolerate it long term. I can put out large volumes of high-quality work for 3-5 weeks at a time, followed by 2-4 weeks of a low-output slump. I usually get fired on the fourth or fifth cycle, freelance for a while, then try again.

Always hoping to find somewhere willing to accept the inconsistency in output but haven't found it yet. Anyways it's cool to hear someone in one of these threads acknowledge this reality for a lot of people. The general HN consensus seems to be that anyone who isn't a top 10% technical star and an incredible productivity machine doesn't deserve to touch code.


What type of work do you do? I run a small consulting company that builds web apps for biotech, and we’re always looking for help. I suffer from similar problems so I’m sympathetic to the situation you described.


> I have a similar constellation of problems and never found an employer willing to tolerate it long term. I can put large volumes of high quality work for 3-5 weeks at a time followed by 2-4 weeks of a low output slump. I usually get fired on the fourth or fifth cycle, freelance for a while, then try again.

This sounds 100% like me.

I currently work for a healthcare company in the mental health space. We’re definitely hiring. Shoot me an email, and we’ll see if we can’t get you into the interview process.

nominallyanonymous@protonmail.com


Larger enterprises should tolerate this better.


Yes, but some small ones do very well.

Those of us who are like this are a “good value” to them - both in terms of relative pay, and because we tend to be VERY loyal when we find a “home.”


Came here to say this. The Library of Alexandria fell into disrepair. Try not to let the same thing happen to you, or all that knowledge will be for nothing.


The mental model for this state is known as "flow"; athletes sometimes call it "the zone".


...lots (most?) of the actual work that needs to be done is "grinding", or working against tons of hard- or impossible-to-remove friction: you'll never achieve any kind of flow doing this kind of work!

So the productivity multiplier advice is this: learn to cope and be productive at doing the kinds of work where flow is impossible/unfeasible!

(This way you'll not be already tired and behind when you get to the work for which "flow" is possible...)


You can gamify anything, and gamifying makes the toil of grinding fun.

Flow happens during fun.

Sometimes there are shortcuts to grinding that can only be discovered during "flow" (usually for me, it's to identify better tools)


If you deal with computationally hard problems, I think there's a whole toolbox of utilities with unbelievable performance, into which millions of PhD-holder-hours have been poured, but that get ignored more often than not.

- SAT modeling languages & SAT solvers

- same for SMT (satisfiability modulo theories)

- Constraint programming

- MIP (mixed-integer programming) and its special cases, like

- TSP (traveling salesman problem)

The latter is maybe more anecdotal (as in: fewer direct applications), but it is emblematic of the whole phenomenon. We see a steady stream of claims that NP-hard problems are "impossible" or take "billions of years to solve". We also see blog posts with the latest ML approach "doing the impossible" and providing reasonable (not optimal) solutions to 50-city TSPs. All this while ignoring that with proper maths, we could solve 50-city TSPs to optimality by hand in the fifties, and now Concorde routinely solves 100k-city instances in seconds on an iPhone.
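To make the TSP point concrete, here's a brute-force exact solver on a tiny made-up instance (the distance matrix is invented for illustration). At this size exhaustive search is instant; dedicated solvers like Concorde reach the same optimality guarantee at vastly larger scale via branch-and-cut, not brute force.

```python
from itertools import permutations

# Made-up symmetric distance matrix for 4 cities.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tsp_brute_force(dist):
    """Return (optimal tour length, optimal tour) by exhaustive search."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):   # fix city 0 as the start
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour
```

The point of the real solvers is that they prove optimality without enumerating all (n-1)! tours.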


Mathematical programming (LP, (M)IP, etc.) feel like superpowers to me. Working with them also helped me realize that many optimization problems are actually just closely-related variations of the same problem.


Maybe because Linear Programming is P-Complete.


I could certainly read more about this; it sounds fascinating. I'd appreciate more of your knowledge, or a thoughtful hyperlink.



I’m aware of what the TSP entails and have written algorithms for it myself, but such an explicitly mathematical approach to programming is new to me.


The ability to type 30+ WPM. This is a pretty low bar to cross.

About half the people I have worked with when faced with the task of writing a 40 character line of code will do something like find a similar line and copy and paste it, and then change a character or 2, and then go find a 6 character variable that they need to type and instead of just typing the 6 characters, they highlight another copy of the variable with the mouse and then copy and paste the 6 characters. The whole thing is really frustrating to watch when you're pair programming with them. I'd be surprised if they're getting more than 10 WPM through all of this. It's got to be a huge mental burden for them as well; I mean, getting a few characters on the screen shouldn't be a mental exercise, you shouldn't be thinking about where you can find a similar line you can copy/paste, and you shouldn't be thinking about where else you can copy/paste variable names from. Getting a line of code onto the screen shouldn't be a creative endeavor.

I know being able to type 100 WPM isn't a great advantage when programming, but a minimum of typing skills, a mere 20 or 30 WPM, is important.


This honestly blows my mind. It's very difficult for me to imagine a programmer who is bad at typing. Do you live somewhere where personal devices are uncommon? They made us learn to touch-type in high school (when I was 15), and it made it so much easier to interact naturally on AIM. It was the most immediately useful course I'd ever had.


http://steve-yegge.blogspot.com/2008/09/programmings-dirties...

"Here's the industry's dirty secret: Programmers who don't touch-type fit a profile. ... Here's the deal: everyone is laughing at you. Or if they're your close friend, they're just pitying you. Because you suck."


Copy and paste can be a good tool to avoid typos. But I see a lot of people who don't know basic keyboard shortcuts or word autocompletion with Tab...


My dad is annoyingly like this. I'm so grateful that I learned vim shortly after he taught me how to code.


I roughly divide these skills into three tiers:

* Micro-optimizations: e.g. a shortcut that saves you a few seconds several times a day.

* 'Mid'-optimizations: e.g. writing a script that saves you an hour once a week.

* Macro-optimizations: e.g. pushing back against building a useless feature, which saves you several months in a year.

Most developers focus on the micro level, but I believe 10x productivity lives in the macro level. The macro level also doesn't require tech skills so much as soft skills and domain knowledge.

As an exhibit, in my career I've seen two projects that

a) never launched

b) took several people several months to build

c) would have never been built had someone pushed back and asked the right questions at the start.


I am not a great dev but having worked long enough in the industry I know where to push back on the macro level and that allows me to build things with incredible speed.

So much effort is spent on building the wrong things for the wrong reasons.


I use these tips as a civil engineer (who knows how to program):

- fully specify the problem before jumping in. Write what you know, the context and the unknowns.

- Write the steps of the solution before jumping into a problem.

- If writing a report, create the outline/heading structure before doing any other writing.

- If you want to do something non-trivial, quickly search the web to see if someone has written a library or has a one-liner to save you time.

- 80% of your results will come from the first 20% of the total time you spend working on something. Anything after that is polish. This helped me to overcome perfectionism.

- If you don’t know something, ask a colleague or peer. They will be thankful for the opportunity to teach you something.

- If you’re ‘stuck’ on something, completely switch context for an hour or even a day. The subconscious mind does serious heavy lifting, it just needs time (and rest!).

- if you’re learning a new concept and struggling, don’t write detailed notes straight away. Your notes at this stage reflect a naïve understanding. Letting an idea ‘ferment’ before writing about it will result in primo notes.

- Don’t spend too much time buggering around with tools and tech. If you’re spending more than 30% of the total task time on ‘ceremony’ or ‘config’, it’s a hindrance.

There’s more but these are the main ones.


You sound like a pleasant coworker


Using "primo" and "buggering" like that: I think I spotted a Kiwi?


Close! Aussie. I thought primo was a yankee word though.


Most of these aren't really productivity multipliers, they just enable you to do the thing you need to do.

It's unlikely that someone is doing task A without any calculus but another person who uses calculus is 10x more productive on the same task.


I think it's also possible that some of these lead to a I've-got-my-hammer-and-everything-is-a-nail approach to problem solving. You may then find yourself parsing HTML with regexps (https://web.archive.org/web/20111009133402/http://stackoverf...), which is never a good idea.


I'm a big fan of `for (x of y)`, skip the functional version, manipulate variables in and out of scope, hell, use globals. Sometimes you show up wearing a tux, other times you just have to rock the party in a hoodie.


This is called the "law of the instrument".


You may want to NOT underestimate one's ability to add TONS of if/elseif/else (or switch/case) statements in order to put things on track...

There are various "programming horrors" stories where you have a calculator implementation that, instead of doing the... you know, math, will actually go through all X combinations.


True, as I wrote the calculus example I remembered this post about a medical researcher rediscovering integration - https://news.ycombinator.com/item?id=26384357

But it still feels more like unknown unknowns causing you to put in a lot of extra effort; I wouldn't classify these as productivity multipliers.


Back in the day, I saw someone ripping a CD to MP3s. He lazily named the first one 1.mp3, the second one 2.mp3, etc. After 9, he found he had run out of numbers. So the next one became... A1.mp3!

I always wondered: if he'd thought this through a little further, he'd have independently reinvented the Arabic number system.


Some interfaces will sort 10.mp3 so it's directly under 1.mp3, instead of under 9.mp3. This could be a totally sensible hack to evade that problem.


It even has a name: natural sort order.

https://en.wikipedia.org/wiki/Natural_sort_order
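In Python, a natural sort key is a few lines: split each name into digit and non-digit runs, and compare the digit runs as numbers (filenames here are made up for the example).

```python
import re

def natural_key(s):
    # Split into alternating non-digit / digit runs; compare digit runs numerically.
    return [int(tok) if tok.isdigit() else tok for tok in re.split(r"(\d+)", s)]

files = ["10.mp3", "2.mp3", "1.mp3"]
files.sort()                    # lexicographic puts 10.mp3 right after 1.mp3
files.sort(key=natural_key)     # natural order: 1, 2, 10
```

This works as long as the names share the same digit/non-digit structure; mixing wildly different names can make the key lists incomparable.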


That's actually not so bad. Physicists in the 1920s reinvented matrix math.


Interesting! What would be a good reference on this?


I most recently encountered this fact in The Man from the Future, a recent biography of John von Neumann. (The first half of it, the first 5 or 6 chapters, was good, but it goes off the rails after that.)

I think by "physicists" I should have said, Heisenberg. I'm talking about this: https://en.wikipedia.org/wiki/Matrix_mechanics#Heisenberg's_... https://en.wikipedia.org/wiki/Heisenberg's_entryway_to_matri...


I just had a recent situation like this which initially presented as a simple calendar / due date list, but then spiraled into some dreaded fuzzy set of if/then cases as the client began to add requirements for due dates of certain items in a certain order that would take priority over other items due on the same date, or missed items, or future items that could be done in advance.

After two weeks of increasing horror, and four attempts to write the algorithm, I sent the client something I called the "Blueberry Muffin Problem" email, as a reference to that scene in Casino. But there is no logical way to do this, I said. Anyway. This led to a series of conference calls in which we finally were able to see a clear strategy.

End result: 40 lines of immaculately clean code, roughly $10,000 at $200/hr, problem solved.

I obviously wasn't paid for writing the code in this situation; it was for spending enough time with the problem to find all the edge cases, and figure out the right questions to ask to get to the solution they needed.


>figure out the right questions to ask to get to the solution they needed

This is how Jon Bentley starts his classic Programming Pearls.

"What are you doing...and why?"[0] is almost always the best place to start when tackling requests from customers

------------

[0] my review/reference in relation to this exact problem, of asking the right question


Yeah, it's kind of meta-coding at a certain point. Business people are great with the logic of making money and keeping customers happy, but they are bad at translating those operational requirements into pure logical processes. Particularly when they expect their employees or customers to use software to guide them through doing things in a set order, they aren't good at thinking about that order from the software user's perspective. So just putting yourself in the shoes of both of those people is really the job. It's understanding the desire of the business and seeing where that's going to run up against problems in the real world.


Can you share or summarize the blueberry muffin email?


Uh, if you really want. This is probably boring as fuck until I get emotional around the third paragraph ;)

---

I'm putting this down here because, even though it seems obvious, it's not; if you spend 10 or 20 hours really thinking seriously about what happens if people write early reports or late reports - the initial conclusion was just count how many they wrote into how many they sent, and tell if they need to send more, but that's not the way life works. Or people work. And it's going to create impossible bottlenecks if the last report is the most important one. The insight I had and what's going to drive this thing is: How many reports were due since the last one you sent. Again, that seems simple but it's not. Technically it doesn't mean they just skip them; it means if one was due Monday and one Wednesday, and you send one on Tuesday, that covers Monday's report; but if you sent it on Wednesday it covers both, the idea being to discard the missed ones without explicitly saying so, and drive them to do what's important today, and more important if it's the final day.

It also covers the situation where they filed Monday's report early, but only if they've filed all the needed reports prior to that; otherwise they'll need another one for Monday.

That's why this is so hard. It's not the code. It's the number of possible scenarios.

So the rule is, we start counting again from when the last report was written, in terms of priority. And when you're looking at 50 reports a day per franchise, you'd better know what the priorities are.
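(As a sketch of the rule above, not the actual code: a sent report covers every due date up to the day it was sent, so what matters is which due dates have passed since the last send. All names and dates here are hypothetical.)

```python
from datetime import date

def reports_due(due_dates, last_sent, today):
    """Due dates that have passed and are not covered by the last sent report."""
    return [
        d for d in due_dates
        if (last_sent is None or d > last_sent) and d <= today
    ]

# Monday and Wednesday due dates; a report sent Tuesday covers Monday's,
# leaving only Wednesday's outstanding.
due = [date(2022, 3, 28), date(2022, 3, 30)]
outstanding = reports_due(due, last_sent=date(2022, 3, 29), today=date(2022, 3, 30))
```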

For myself personally, I think this is writing software that will lead to absolute catastrophe. I've always tried to align the way my software works with how I think people will be able to use it, to push them to do the right thing. Anyone can write the software I write, it's the thinking about this process that is extremely difficult if you're dealing with, like, hundreds of [redacted] and thousands of customers spread out over time, and have to figure out how to make reasonable suggestions. This is such a thing.

What I'm saying is that this is marching into creating a thing that will cause total chaos and I don't think this is the right way. We have the chance now to re-imagine the frequency and rearrange the expectations of the customers and the expectations placed on the staff, and I think we need to do that. The algorithm you asked for is done, but it will not be good for people. And it is extremely hard to understand, even for me, if you asked me why one should take priority over another, when dozens are urgent on the same day. I wrote an "urgency" algorithm on top of it to try to deduce that. The best I can say about it is that it will cause less misses and damage than any other way of looking at the list, and it took me a long time to get to. But I think you're asking for something that is going to be so inefficient, and create such high customer expectations, it's going to be negative on both sides.

The scene keeps going through my head from "Casino", where De Niro tells the chef in the hotel to make sure the same number of blueberries are in every muffin, and the chef just drops his hands and goes, "do you have any idea how long that's going to take?" This is a blueberry muffin situation. We need to think of a better way.


I found that fascinating, thanks for sharing!


This is well put. It gets to my ambivalent response to this list. Understanding parsers is important ... if you're parsing things. Etc.


I think the true 10x multiplier is not a technical skill at all. It is the ability to quickly cut through irrelevant tasks and actions to focus on the thing that will move the business goal forward fastest. It is enshrined in the concept of the MVP and the short iteration cycle. Quickly building something that does not solve a business problem is not productive. Work = force × displacement. No movement, no work.

A closely related skill is the ability to cut through all the abstractions and interfaces to get to the root cause of what is broken or needs to be changed. A technical and organizational debt buster, if you will.


Absolutely this. I was a chronic procrastinator prone to intense anxiety and breakdowns just before deadlines. Then I took a really good ~3 hour course on time management, and it seriously changed my life. I can’t even describe how much more productive I am. It’s changed how I approach every task, professional and personal. I can’t even remember the last time I felt any sort of stress or anxiety about my to-do list. A big part of it is figuring out what doesn’t have to be done right now (“positive procrastination”).

Edit to add that the course was “Time Management Fundamentals (2016)” by Dave Crenshaw. I completed it on LinkedIn learning. For those of you who’d like to give it a try, my advice is: do all the exercises. You won’t get the full value out of the course unless you actively incorporate the whole routine into your workflow. There’s a learning curve, but it’s absolutely worth it. Also be warned that the presenter’s personality is a little off-putting (to me, anyway), but he absolutely knows what he’s talking about, so listen to him.


I'd like to second the recommendation for Dave Crenshaw's courses. I never realized before that I wasn't actually managing my time; I was just desperately chasing the non-existent end of my entire life's to-do lists. I'm still working on getting through my initial in-box (the one you create by going through his trigger lists), but all the important stuff got done, and I have confidence that the things I haven't gotten to yet aren't important enough for me to panic about (otherwise I would have done them!).

The "time management tips" one is a kind of agglomeration of his advice on managing your time, your relationships, your reports, etc., and is a good follow-up. He has a course on having productive meetings, as well, which is well worth the time. It may make you hate meetings at your work, though XD.

Honestly his courses provided a lot of relief and validation to me, as well as hope for a future working in companies. I can't recommend them enough.


Thanks for the other course recommendations! Given how insanely valuable his time management course was for me, I’m absolutely going to jump on that “productive meetings” course.


What course was this, may I ask?

I might watch it ... later :p


I started going through the course three days ago, right after you posted it. I can already feel improvements in organizing my time.

The course doesn't offer anything novel - for example it has elements of Getting Things Done. But it's presented in a more accessible way that resonates with me. It also focuses on making good habits, something I already discovered in another book recommended on HN, Tiny Habits.

Thanks for the recommendation. I really appreciate it.


I'm also interested in what you learned and which class you took!


Could you share any tips or the name of the course?


Sorry about that—edited the original comment to include more info :)


Can you share this resource?


Sorry about that—edited the original comment to include more info :)


Curious what course you took and if you have time, a summary of the most valuable skills you learned?


This. Which isn't to say that technical knowledge doesn't matter, it truly does. But, the primary problem in the industry right now is that the majority of finished code generated by professionals goes unused. The amount of waste is staggering. Working on the right thing is way more than a 10x multiplier.


Yes! I often tell people this in interviews and they act like I don’t have enough technical knowledge. The cult of Leetcode has made this so much worse.

You rarely, if ever, need Leetcode. What you need is to understand deeply what you should be building so you don’t waste everyone’s time!


Part of the issue is that they often don't want you to decide what to build. They've separated out the "building" from the "deciding what to build", so hearing from a candidate that it's important to focus on what to do is off-putting to an organization that just wants you to munch through JIRA tickets all day and find technical solutions to whatever somebody else wants, and then keep a smile on as they change their mind and discard the feature.


100%. Legacy company management finds software engineers who engage in the very relevant question of "what to build" extremely threatening. In my experience they actually treat software engineers who behave that way as disloyal.

As compared to places like tesla/shopify/etc where the Steve Jobs quote rules the roost:

    It doesn't make sense to hire smart people and tell them what to do. We hire smart people so they can tell us what to do.


And it's not only about features that go unused, but also features that are used but only in support of workflows that improve nobody's life. Or functionality that's used but is a wildly inefficient version of what could have been.


I’m curious about what you’d consider concrete examples of these inefficient solutions? The biggest thing that comes to my mind are SPAs that don’t need to be SPAs.


One example from my employment history was a customer that kept asking for (and unfortunately getting) one special reporting feature after another in a data processing system.

After a couple of years and several such reporting options someone asked, "Can't we just generate a spreadsheet and then your business analysts can do whatever reports they want from that?" and the customer responded, "You can do that?! That would be so much easier for us."

Imagine the weeks or even months of human life (both implementing but also using the less efficient reports) that could have been saved if the right solution had been implemented from the start.

That's literally time taken out of someone's life that they could have used to be with family or whatever they wanted instead of transcribing ridiculous reports.

(Of course, those reports were probably only generated to be looked at once in some recurring management meeting, only for the executives to feel good about having seen data. I don't think the reports were ever used to support any decision.

This means the entire reporting theatre was completely bullshit and the customer could have paid the business analysts to sip drinks on the beach instead with no loss of productivity. But that's just my cynical view.)


> That's literally time taken out of someone's life that they could have used to be with family or whatever they wanted instead of transcribing ridiculous reports.

Only if the product is run by one or two people who give each other that much time off. *Much* more likely, this work was not commissioned by the client, leaving space for other work to be done.

This "automation will lead to more free time" is an utter myth, because those with the power to pay staff will just sell their staff's time+skills to someone else/some other problem.


You're right, of course. In practice, the organisation would likely make up some other bullshit work to fill up the time, because God forbid people get to enjoy life!

(There's a tiny chance the void would be filled with useful work, and that's also a good outcome in my book.)


> The biggest thing that comes to my mind are SPAs that don’t need to be SPAs.

These are unlikely to be technical issues, and more people/process issues. SPAs are popular because they optimise for developer productivity (so arguably the opposite). Others have given examples, like focusing on the wrong problems to solve (if you're a startup making a video game, do you really need to build your own chat app? [0])

[0] https://nira.com/slack-history/


> The amount of waste is staggering. Working on the right thing is way more than a 10x multiplier.

How much of that waste would you attribute to 1x programmers and how much would you attribute to 0x managers?


85% in favor of managers being the problem. I pulled that number from the Ford Motor Company turnaround where Deming said management was 85% of the problem.

And I mean, who is responsible for curating those 1x engineers in the first place?


I would attribute most of it to poor customer development or product management. If you can’t find the demand for your product, it doesn’t matter how good the engineers or managers are.

Failed projects are inevitable though, even if you do everything well. Good teams don’t look for people to blame, which results in no one sticking their neck out to advocate for stuff. Instead they look to learn from each experience and hone their processes.


> the majority of finished code generated by professionals goes unused

What are you referring to?


I'm referring to how companies ignore the thorny problem of what to build and instead focus on getting "Something done", which results in delivery of product increment that users at best say "meh" to, and don't use.

As an aside, from a developers perspective, all of the work is clearly valuable, I mean they got paid 300k+. But from a true definition of value perspective, the change increment is just not valuable to the users as evidenced by the lack of use.

    Value:  Reliable fulfillment of needs


> from a developers perspective, all of the work is clearly valuable, I mean they got paid 300k+

Midwestern US salary caps are more like $90k-$150k.


Amen. Or as Peter Drucker puts it (he is dubbed "the founder of modern management" on Wikipedia):

"There is nothing so useless as doing efficiently that which should not be done at all."

https://en.wikipedia.org/wiki/Peter_Drucker


And likewise, this theme shows up in Tom DeMarco's classic book Slack, which contrasts "efficiency" (the rate at which an organization is moving towards some goal) vs "effectiveness" (the ability of the organization to choose and steer towards better goals). An important theme of the book is that an organization running full-tilt (maximum "efficiency") intrinsically reduces/eliminates its needed human "slack" to reflect and iterate towards the correct goals. DeMarco also digs into the many organizational and management anti-patterns, with supporting research, that harm both effectiveness and efficiency (and just plain human well being...)


“Efficiency is doing things right; effectiveness is doing the right things.”

Damned, I had to search for the author and... it's also from Peter Drucker !


"If you don't know where you are going, it doesn't matter how fast you get there."


nice one as well !


This is the right answer.

The proof is that a good coder spends about 10x as long thinking about it before actually committing any code. In a sentence: Be able to visualize the problem in your head first. The actual code, if it's elegant and succinct, is incidental.


> The proof is that a good coder spends about 10x as long thinking about it before actually committing any code.

This seems antithetical to the post above yours. Or at least how I'm reading it. In bigger tech co's this looks a lot like massive design documents with a dozen reviewers, all giving enormous amounts of time and thought to the idea in the pursuit of "building it right". Design processes that can take weeks or months for what is otherwise an MVP.


I would say analysis paralysis: no. Strategic thinking/designing: yes.


Take the time to distill it down to a powerful but small concept.


I've been thinking about this idea a lot recently. I'm decently new to the big-tech-co space, but it does seem like a huge amount of resourcing goes into design and defense of projects. I wonder what y'all think about someone doing a timeboxed MVP (a sprint or less?) as a way of sussing out design, and THEN getting reviewers/etc?

Obviously you have to take security into large consideration when doing things like this - but this is what qa/sandbox environments are for.


Hell, wait till you find out how much time is wasted in Dev/Ops just changing the tooling every year or two, literally tearing down and rebuilding everything you already have in the name of staving off another couple years of technical debt. It's silly.

I'm a bit of a rarity in the modern world (and here in the corporate space that is HN), in that I'm the sole programmer on a bunch of large software projects in production that I also maintain and sysadmin 24/7. The only meetings needed are to understand what the clients want and bitch at them that what they want makes no sense, until they start making sense, and then trust me to build it. If something comes up from a design perspective that I think could go one of several ways, I usually have that thought in the shower and mumble to myself for awhile before framing it in terms they would understand as a series of yes/no choices.

This is generally way more efficient than hiring a team, and I know, because I've tried to hire teams to do it. There's a limit to what one coder like me can do, but it's a lot higher than what 4 people bickering can do. I'd say it's around what 8 bickering people can do.

Design and defense of design, though, is not just about ego. Not if it's done properly. The best design/code people will come to you and say this is why this is the only way to do X and lay out the chain of logic that led them to that conclusion. That's not because they'll be personally hurt if you don't go that direction; it's that they're annoyed they'll have to do extra work to make an inferior product if you don't take their advice. Promote those people.


> I'm a bit of a rarity in the modern world

I'll say. It does not sound like you'd be very fun to work with, as a teammate or a stakeholder.


What happens if you get hit by a bus or just want to take a vacation? I think this is more about your inability to work with other people than it is about you truly being a 4-8x coder.


Well, thanks for the confidence; you did hit the nail on the head. I haven't taken a real vacation in 20 years, although I did live in vans and hostels for 10 of those years. For peace of mind, I maintain a set of PGP files signed and keyed to each of my clients which I flippantly tell them to open "if I get hit by a bus", that contain all the SSH keys and details needed to get to their source files / DBs / webservers, extra code hints and lots of juicy advice for whatever unfortunate developer takes over after my demise. (Including shit like, this software is way too old, rewrite it).

If you saw what I write you'd say I was a 20x coder, but that's just from a youngin's perspective. I've been writing code since 1988. The basic value proposition for my clients is that they know I'll die at some point, but it's way cheaper and more efficient to pay one guy they know will get it done, and take the risk they'll have to scramble for someone else in less than 10 years (at which point they'll likely have to rewrite anyway). Also, the shit I write mostly maintains itself. I'm proud to say that my software in 24/7 production has had no actual crashes in years other than the occasional forced server upgrade. If I died, it would take awhile to notice if they didn't keep asking me for new features.


A lot of big corps have Hackathon weeks for doing just this. A free-for-all to build something you want and that you think will be helpful to the org from your perspective.


And as soon as those weeks are over everyone goes right back to their previously scheduled work.


Launching any kind of work into production in a big organization makes it even more important to just focus on tiny changes (sometimes a few lines of code) with huge benefits.

You can have bigger changes as proof of concept first, but select, properly design and launch the most important one.


Exactly. I'm living proof that seniority comes with the ability to spend more time on problems to understand all the details before committing anything, rather than committing stuff with no idea how it works.


Nice: Ah, FedEx founder F. Smith just booted himself out of the CEO slot and to just Chair of the BoD.

At one time he was scared of the problem of scheduling the fleet. Yup, if you describe the problem in complicated terms, then the scheduling problem gets wildly complicated, e.g., NP-complete, etc. -- a big team could have worked months on just the first version. But he needed answers right away and, to satisfy concerns of some investors on the Board, needed the answers also to apply to the whole, planned company (eventually it grew to something much bigger than the initial plans).

I was in a meeting looking for a solution with several people, and it was all confused. Finally I just announced that I'd solve it.

I was teaching computer science at Georgetown and had 6 weeks to go to the end of the semester. So, I wanted to be done in six weeks. How'd I do that? Sure, just concentrated on what the business actually needed and threw out the rest. So, in six weeks I had the software putting out nice schedules.

Our two representatives of BoD member General Dynamics and major investor evaluated the schedule and announced "It's a little tight in a few places but it's flyable.". The BoD was thrilled.

At one point Smith stated that my work "solved the most important problem" facing the company. The 10X, maybe 1000X, difference was essentially just in delivering what the business needed and dropping the rest.


"The main thing is to keep the main thing the main thing." --Stephen Covey


> It is the ability to quickly cut through irrelevant tasks and actions to focus on the thing that will move the business goal forward fastest.

"Landing the plane" as they sometimes say.


This is going to vary wildly by niche. My neighborhood is line-of-business apps for small to medium sized teams and departments in corporate settings.

- Knowing as much SQL as you can learn.

- Being enough of a project manager to at least handle your own business.

- Being enough of a public speaker to be able to evangelize your own work.

- Being enough of a meeting and event planner to contribute to those sorts of things.

- Having no real interest, maybe even an actual aversion, to the latest fads.

- Being enough of an administrator to at least be able to build servers, set up a server's OS, install important software like web and database servers, and generally keep them running.

- Basic corporate communications, technical writing, documentation.

- Being able to manage everything so that the release cycles are tight. Maybe more than anything else, how quickly you can turn a business conversation into production code deployed to end users seems to determine a lot of success.

- Being engaged and interested in your team, helping with staff meetings, contributing to things the boss is trying to do, generally being a "team guy".

- Having a good working understanding of the entire life cycle of your software. For me that means understanding most of what happens when web browsers, the client OS, drivers, networks, network protocols, web servers, application code, and databases all operate smoothly to render HTML on peoples' screens.


Some less mathematical / core CS and more cloud / system engineering:

- Exponential backoff

- Jitter

- Consistent hashing

- Gossip algorithm

- Basic “Functional” aspects of a language (map, flatMap, reduce, filter, some, find)

- Putting things in a queue and using a “serverless” consumer
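The first two items compose nicely. Here's a minimal sketch in Python (the helper name and parameters are made up for illustration, not from any particular library):

```python
import random
import time

def retry_with_backoff(fn, max_attempts=5, base_delay=0.1, max_delay=10.0):
    """Retry fn, waiting exponentially longer after each failure.

    'Full jitter' draws the sleep uniformly from [0, capped backoff],
    which spreads retries out when many clients hit the same service.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            # exponential backoff capped at max_delay, then jittered
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(random.uniform(0, delay))
```

Usage is just `retry_with_backoff(lambda: call_flaky_service())`; without the jitter, synchronized clients all retry at the same instants and keep re-creating the thundering herd they were backing off from.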

But some of the more day to day productivity boosters: know your development tools (including all keyboard shortcuts, including multi line editing, refactoring, grow shrink AST based selection, know all git / bash commands without googling…)

Knowing your programming language / framework deeply is also a deceptively simple one that just needs practice and dedication.


>But some of the more day to day productivity boosters: know your development tools (including all keyboard shortcuts, including multi line editing, refactoring, grow shrink AST based selection, know all git / bash commands without googling…)

Basic skills like this should be at the top of the list, including fast typing, using all available keyboard shortcuts, avoiding using the mouse but nevertheless being fast and accurate with it when required.

The amount of time people spend typing and using the mouse dwarfs any other time investment and if you improve your speed with these basic skills, the productivity multiplier can be enormous.


- Learn to code. (not everyone here codes)

- Once you learn how to code, learn how to automate. Design everything you can to be declarative.

- Learn how to speak and write well.

- Learn how to negotiate and influence people (read the "classics", eg "Never Split the Difference...", "How to Win Friends...", and so on.)

- Work to increase your EQ/Emotional Intelligence.

(Yes, these are all actually productivity multipliers.)


Will you explain what you mean by "Design everything you can to be declarative"?


I believe, by "Design everything you can to be declarative", runjake is talking about creating good abstractions that satisfy a particular use case, shielding the user from the implementation details.

Wikipedia defines declarative programming as "In computer science, declarative programming is a programming paradigm—a style of building the structure and elements of computer programs—that expresses the logic of a computation without describing its control flow." [1]

By decoupling a user from the low level imperative understanding of the system and focusing on the high level abstractions, if the interface is well designed, it enables the user to think at a higher level of abstraction and expands their boundaries in expressing what they want to achieve using the higher level constructs.

One of the most famous declarative interfaces is React.js, which allows one to think at the level of "components" [2], while in actuality it abstracts the DOM from the user as a virtual DOM [3] and performs reconciliation [4] internally to render to the actual DOM. But as an end user, you don't really have to think about these implementation details and can just work with the "components" abstraction.

Most user interfaces I have seen are abstracted in a declarative / markup language. And probably the most famous declarative interface of all is SQL.

In relation, I would also recommend reading this wonderful article about leaky abstractions by Joel Spolsky to understand the flipside of abstractions, in general: https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-a...

--

1: https://en.wikipedia.org/wiki/Declarative_programming

2: https://reactjs.org/docs/components-and-props.html

3: https://reactjs.org/docs/faq-internals.html

4: https://reactjs.org/docs/reconciliation.html


Automate figuring out how to do things from a machine-readable specification of the desired result.


I think they mean if you can, don't define the execution yourself. Rather, specify what needs to be done, then have the machine work out what to execute.

Eg declare dependencies and have Make build things, specify SQL query and have the optimiser fetch the data
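As a self-contained illustration of that split (the table and data here are invented for the example), the same aggregation can be written both ways against SQLite from Python's standard library:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 30.0), ("bob", 15.0), ("alice", 20.0)])

# Imperative: spell out *how* to compute per-customer totals.
totals = {}
for customer, total in conn.execute("SELECT customer, total FROM orders"):
    totals[customer] = totals.get(customer, 0.0) + total

# Declarative: state *what* you want; the query planner decides how.
declared = dict(conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer"))

assert totals == declared  # same result, different division of labor
```

The payoff of the second form is that the engine is free to pick indexes, parallelism, or a different algorithm later without you rewriting anything.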


Also build tools in compositional scripts with good names


All those little bits of knowledge are nice tools and patterns, they're definitely good to have. And if you have curiosity, you will naturally keep learning dozens of those over time.

But they are not applicable everywhere, so you should not expect your 8 hour days to turn into 48 minutes because you started using regexes, dynamic programming, and search algorithms everywhere.

The unlikable and unhelpful answer, but probably the single biggest fundamental multiplier, is unfortunately the g factor. General intelligence.

The only 10x differences that genuinely exist are between people who are 2x faster than average and people who are 5x worse. Not because they learned a trick, but for a complex combination of reasons, including having been handed down two or three standard deviations in general intelligence.

The harsh reality is that there is no single thing a 1x person can do to become 10x.


You may wish to read Gould's Mismeasure of Man, and about the fallacy of reification.

And brain plasticity, and the characteristics of high performing groups or institutions.

Changing the way your group works together is going to have a more profound difference on the outcomes than focusing on individual performance.


I think you're right that focusing on group effects will be more productive, but I probably reach the same conclusion for different reasons :)

I intuitively don't like the idea of the 10x programmer, my core argument is actually that you should not try to become a 10x programmer, and that there's no order of magnitude productivity you can gain in a vacuum, ignoring group dynamics. So you shouldn't focus on individual performance too much! There's not much you can do about it, anyways

Managing groups of people is hard, and there's often big inefficiencies that are going to be obvious to some people (engineers love to complain about management!) but that have a true multiplier effect when fixed. Because for the time invested, you have an effect on the whole team. That's a true multiplier.

I don't know what specifically you think I should learn about the fallacy of reification, but please feel free to elaborate.


> The only 10x differences that genuinely exist are between people who are 2x faster than average and people who are 5x worse.

The original idea of a 10x developer was between best and worst, not best and average.


> The harsh reality is that there is no single thing a 1x person can do to become 10x.

Any evidence for this? You may say there's no evidence that a 1x person can work up to 10x its performance, but stating it's impossible without evidence lacks merit.


Two answers to this.

The first, which I sort of already touched on below, is that gaining a whole order of magnitude of productivity, not just on a single task where a cool pattern applies but on average, is a really big effect size. If your colleagues started completing tasks that should take a month in 3 days, you would definitely notice.

The second order effect is that if we had discovered any method that reliably let an average person gain 10x productivity, it would have generated tremendous profit for companies and spread like wildfire. We haven't seen that, so we should be very suspicious when people claim a single bit of knowledge, method, or pattern is going to improve your performance by 10x.

(The other response, which is also annoying and unsatisfying, is that I'm making the boring claim that there is nothing interesting, so the burden of proof is actually not on me. If someone thinks there exists any intervention that can increase an arbitrary developer's performance by 10x, it's up to them to collect data and reject the null hypothesis)


There's no single thing.


The OP is not about 10x more productive, it is about 10x productivity boosts. Using regexes can easily be 10x faster than manual editing, and property-based tests can easily be written 10x faster than a more or less equivalent battery of tests.
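The bare-bones version of a property-based test fits in a few lines; real tools like Hypothesis or QuickCheck add input strategies and failure shrinking on top, and `my_sort` here is just a stand-in for whatever function is under test:

```python
import random
from collections import Counter

def my_sort(xs):
    # stand-in for the function under test
    return sorted(xs)

def check_sort_properties(trials=200):
    """One property check replaces a whole battery of example tests:
    the output must be ordered and be a permutation of the input."""
    for _ in range(trials):
        xs = [random.randint(-100, 100) for _ in range(random.randint(0, 20))]
        ys = my_sort(xs)
        assert all(a <= b for a, b in zip(ys, ys[1:])), (xs, ys)
        assert Counter(xs) == Counter(ys), (xs, ys)  # same multiset
    return True

check_sort_properties()
```

The speedup comes from stating the invariant once instead of enumerating edge cases by hand; the random generator finds the empty list, duplicates, and negatives for you.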


I agree with you :)

I actually like those little patterns a lot, there are problems where you might absolutely take 10x longer because you didn't know the algorithm or the right machine learning concept.

The OP talks about becoming "significantly more productive when solving a large class of problems", and where my interpretation differs is what you mean by large class of problem.

What's large for you is really going to be relative and will vary based on your personal experience, so I can't say you're wrong. I know I speak at least for myself when I say that in my day to day, most of what I do is not leetcode-type problems, so despite knowing many algorithms I don't think I would be 10x slower if I didn't know them.

If your interpretation of OP is that these boosts will on average give you 10x more productivity, that's the only place where I really disagree.


On the point of regexes, I think you might be misunderstanding OP. Regexes have absolutely given me a 10x productivity boost over the years, but not because I write production code that relies on regexes to parse things, extract things, etc.

The 10x productivity boost I get with regexes is being able to effectively automate out a lot of text manipulation on a large scale. A couple examples. Say I get a file with a bunch of junk in it that contains some IDs that I need to use as input to other commands. I write a simple regex in a text editor, find all, delete everything else, then use other regexes (or multiline editing) to wrap those IDs with the commands I actually want to run. A more impactful example. Several years ago I was writing a new loadtest suite, when I realized that it would look A LOT like some existing load tests we had. They weren't close enough for composition or inheritance to solve the issue completely (we were already using both in a few places in the codebase to help out), so I used a similar process to the one I laid out above to write the new loadtest suite. I barely wrote any of the code manually, I was doing most of the work from a level of abstraction above, making broad code changes with regexes. I did it all in an afternoon.

To get the 10x multiplier from regexes you've got to basically bake them into everything you do.
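When the editor runs out of steam, the same extract-and-wrap workflow is a few lines of Python (the log format, ID scheme, and `replay-order` command are all invented for this sketch):

```python
import re

# Junk input containing the IDs we care about (format is hypothetical).
dump = """
2022-03-30 INFO  created order id=ord-1043 for user 7
garbage line we don't care about
2022-03-30 WARN  retried order id=ord-2210
"""

# 1. Regex "find all" instead of eyeballing the file.
ids = re.findall(r"\bid=(ord-\d+)\b", dump)

# 2. Wrap each ID in the command we actually want to run.
commands = [f"./replay-order --id {i}" for i in ids]

print("\n".join(commands))
# ./replay-order --id ord-1043
# ./replay-order --id ord-2210
```

The same pattern scales from two IDs in a log snippet to thousands of lines of mechanical code changes.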


That's entirely fair. I do the same, a lot. I like writing macros, using regexes, or even short awk scripts to automate any manual text workflows.

I'm not trying to deny your experience, so I can accept your word for it that it would take you 10x longer to complete tasks if you weren't able to do this.

At this point the core of the argument becomes more what sort of mix of tasks your daily work consists of, I think. I agree that there are patterns that solve some problems very efficiently, but in my daily life if I try to look at people who complete projects 10x faster than average, I can't point at any single method or bit of secret knowledge.

So I think the twofold response is first that the average developer is actually able to use search-and-replace and already has some vague notion that regexes are a tool they could reach for, so this is really saying that "0.1x" productivity exists in a way. And I think that's uncontroversial: if you imagine a hypothetical programmer who is still using the `ed` text-editor and hand-writing machine code to address business problems, in a world where people throw together microservices running in K8s in 100 lines of code, they're going to be a lot slower than average.

But I think the question is really about whether starting from an average (or if you want, median) dev there's any single thing that will give you a 10x productivity boost, that's not already something normal the average person knows about.

The second answer I have, and that I keep coming back to, is that if we had such a thing it would be more popular than coffee and morning meetings. It would automatically and very quickly have become something that companies rapidly adopt and that becomes the new normal (like search and replace!).

So finally the point is not that there aren't thing like regexes that slow you down a lot if you don't know them, it's more that there aren't any such things that are also well-kept secret. If the effect size is a whole order of magnitude, you won't have to scour forum threads for secret methods that turn you into a Mythical 10x Programmer, it will just quietly already be on its way to being the new 1x, and it will most likely be a gradual improvement over time.


Respectfully disagree. The compendium of skills integrated is the "Single thing"


Sure, but I feel like that's trying to argue semantics without really changing the nature of the problem.

If the 'single thing' is really a whole compendium of integrated skills, then you can't easily teach it to people the same way that you can teach them regular expressions or parser combinators.

This sort of nebulous skill that very productive people have is also especially hard to transfer. You can spend a week taking math lessons from Terrence Tao and seeing how he works, but you will still not be Terrence Tao at the end of the week, no matter how many interesting mathematical tools and methods you learn.

My point is that there is No Silver Bullet that will consistently turn your 10 hour days into 1 hour days, or if there is one I have never seen it, and evidently no company has found one. Otherwise all of their employees would follow the same method and become 10x programmers. This has not happened.

It helps to learn popular patterns and algorithms, but I don't think learning more patterns will achieve the kind of order of magnitude improvement you're looking for.


Solving problems iteratively in the same way progressive jpegs render: Start with essential parts, quickly assemble them to a usable draft and iterate on the entire thing by improving quality where it needs to be improved.

This applies to programming as well as to writing, drawing, making music, etc.

Applying this to writing emails can definitely improve your business communication 10-fold:

* write down the important parts you want to convey, even in incomplete sentences

* improve the ideas into sentences

* rearrange the paragraphs/parts/sentences into proper order (written language rarely comes out in the correct order).

* cut unnecessary talk

* iterate on the above 3-8 times, depending on the importance of the email.


> Solving problems iteratively in the same way progressive jpegs render: Start with essential parts, quickly assemble them to a usable draft and iterate on the entire thing by improving quality where it needs to be improved.

This is something that I feel I’m really good at, and I like this explanation a lot.

I’ll add that while being able to look at problems at a high level and then slowly increase “resolution” is a huge advantage, the key in my experience is the ability to switch between those levels of abstraction quickly.

The more complex a system, the more important this ability can be.

Let’s say you’re building out a large new feature that can be implemented in three parts: Service A, Service B, and the communications between them. That’s the highest level of abstraction: we are dealing with three components, each of which has a very high-level purpose/domain.

Once each component has a defined domain, we can disregard the other components when drilling down into each one. This works very well, but invariably we'll reach a point where we discover that an assumption made at a higher level is no longer valid.

At that point, there are two options: we can stay at this level of abstraction, leave our component domains unchanged, and "work around" the issue. Sometimes that's fine and leads to relatively minor hacks or technical debt. Other times, it's not fine, because the violated assumption can be much more easily remedied by going up a level of abstraction and changing the domains of our components. Often, the assumption that didn't hold in Service A also impacts Service B, and doesn't hold there either.

Maybe that will result in splitting the problem into one more service; maybe it will mean that the domains of each service need to be adjusted so the issue is isolated into one of them; maybe it means that we chose to split the problem up along the wrong boundaries to begin with and need to reconsider them entirely.

My point is - starting at the most abstract level and working your way down is powerful, but you don’t always have all the information necessary the first time you walk down that complexity tree.


Learning to adequately estimate probabilities and make decisions based on those probabilities is extremely powerful, and that's true whether you use those skills in your investments, your career, or your relationships. You can get a lot of practice on sites like Metaculus[1] and Manifold Markets[2].

I've also found a lot of useful discussion around these and similar skills on LessWrong, a site dedicated to the art of recognizing and correcting errors in reason and cognition. That site is full of 10x multipliers, particularly in their "Best Of" collection[3].

[1] https://www.metaculus.com/questions/

[2] https://manifold.markets/home

[3] https://www.lesswrong.com/bestoflesswrong


Has anyone tried to use these kinds of forecasting skills to improve software delivery estimates?

Software developers are notoriously terrible at estimation and lately I've been thinking that we should look to Superforecasters[1] for inspiration about how to improve.

[1] https://en.wikipedia.org/wiki/Superforecaster


You might find How to Measure Anything: Finding the Value of Intangibles in Business by Douglas Hubbard useful. - https://hubbardresearch.com/about/applied-information-econom...

1) https://hubbardresearch.com/

2) https://www.howtomeasureanything.com/
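For anyone curious what Hubbard's approach looks like in practice, here's a minimal Monte Carlo sketch (all input ranges are invented for illustration): take 90% confidence intervals on the inputs, sample them, and get a distribution for the output instead of a single made-up number.

```python
import random

random.seed(42)

# Approximate a 90% confidence interval (lo, hi) as a normal distribution
# with mean = (lo + hi) / 2 and sd = (hi - lo) / 3.29 (the width of a
# 90% interval in standard deviations).
def ci_to_normal(lo, hi):
    return (lo + hi) / 2, (hi - lo) / 3.29

# Toy question: annual value of automating a task.
hours_mu, hours_sd = ci_to_normal(100, 400)   # hours saved per year
rate_mu, rate_sd = ci_to_normal(30, 90)       # value per hour, $

samples = sorted(
    random.gauss(hours_mu, hours_sd) * random.gauss(rate_mu, rate_sd)
    for _ in range(10_000)
)
# The 5th and 95th percentiles give a 90% interval for the product:
print(samples[500], samples[9500])
```

The point of the exercise isn't precision, it's that a two-variable estimate with honest uncertainty beats a single confidently wrong number.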


Knowing how to use your tools. Operating systems, window managers, editors, debuggers, cli&gui tools in general, keyboard shortcuts etc. I often find the lack of this pretty jarring, watching how slow and ineffective some people are when using their basic tooling, even though they are a couple of years into their career and have written a todo app in a dozen currently cool languages/framework on the side.

It feels a bit like watching a chef who has all the theoretical knowledge about food composition and such, but takes 5 minutes to chop an onion.


I find general computing skills to be one of the best predictors for overall employee performance.


Stepping back and balancing the numbers: e.g. Mass balance, energy balance, balancing the books, balancing your schedule, profit analysis, etc.

Usually this comes down to simple math and a bit of black boxing. It's amazing how often people miss the big picture and therefore don't see problems, losses, hidden costs or even opportunities because they're not looking at how the inputs and outputs balance.

A related skill is developing a feel for (or be able to quickly estimate) the appropriate order-of-magnitude of something.

Nothing saves time and effort like being able to scan some numbers and immediately identify a problem.
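A concrete version of that balancing habit is the two-line sanity check (the figures below are invented): take a claimed number, convert it into a rate you have intuition for, and see if it still smells right.

```python
# Does a claimed event volume even fit in a reasonable pipe?
claimed_events_per_day = 2_000_000_000   # hypothetical vendor claim
bytes_per_event = 5_000                  # rough payload size, assumed
seconds_per_day = 86_400

required_mb_per_s = (
    claimed_events_per_day * bytes_per_event / seconds_per_day / 1e6
)
print(f"{required_mb_per_s:.0f} MB/s sustained")  # ~116 MB/s
```

Two lines of arithmetic, and now you know whether the claim implies a laptop's worth of bandwidth or a data center's.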


I’m betting you’re a trained scientist or engineer.


I would like to challenge the underlying idea:

Asking "what fundamental knowledge" is like asking what telescope makes the best astronomers.

The problem with knowledge is applicability. You can know a lot of fundamental things, but if you cannot recognize the patterns where they are useful, it is dead knowledge.

My personal experience through my life (30+ years in the field) and observing and interviewing developers in many different industries tells me that curiosity, logical thinking, and emotional intelligence win the day. And these are not pieces of knowledge, but personal characteristics and skills to develop. And they apply to many careers, including software engineering.

I think that more important than knowledge accumulated is your ability to conceptualize and recognize the concepts in a different context. Knowledge is just a side effect of this.


Seems like you vote for vector space embedding :) I've seen an interesting paper about finding unknown embeddings, and I think it's an important skill that AI will also possess.


this is an unhelpful response: the guy asks a straight question and the response is to "challenge the underlying idea". It would be better to answer the question rather than attempt to knock it down. (A good example of zero-sum bias.)


It is actually pretty astute and on the topic to me. Skimming through here, a lot of people mention the fact that as they grow into their careers, one of the biggest gains is judgement about what sorts of problems even should be solved.

Or like. If someone came in asking "I'd like to make my car go faster, should I paint it green, or blue?" it's not unhelpful to point out that that's a solution to a different problem.

Plus anyway the audience is greater than just the person who initially posed the question. Conversations around this and other similar answers are vibrant today, so clearly they're helpful to someone.


Most people on hn don't have deep technical knowledge and don't like to be told that it matters


I have deep technical knowledge on some subjects, but how I got there is through curiosity and reasoning, like the parent commenter said.

There is knowledge you can acquire, and then there are skills that can allow you to acquire the right knowledge at the right time. For example, having a baseline understanding of logic and mathematical reasoning could help you to understand when a problem area might already be formalised, and to find that area and understand the work therein.


Unhelpful response (is indempotent)


"idempotent"


Correcting typos is such a more helpful response!


In the software development world I would suggest: knowing how to interface one's own software with existing systems to automate tasks.

I am thinking specifically about programming in python/javascript/etc. and the act of requesting web pages, reading files, or querying databases and parsing the results to quickly gather lots of available information about something. This could be a weather data scraper, reading all files from a directory to transform them, checking repeatedly if some signup for something important is open yet, pushing receipt pdfs from emails to an archive, ...

The examples all have in common that you need to know how e.g. REST/file/database access works and you can use these to gather information quickly and automatically. Knowing how to interface with common systems can save you lots of time, therefore I would see this as a productivity multiplier.
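Most of these little automations boil down to the same parse-and-filter step. Here's a minimal sketch (the JSON payload and field names are made up; in a real script you'd fetch the data with urllib.request or the requests library):

```python
import json

# Hypothetical weather-API response, inlined so the example is self-contained.
raw = """
{"days": [
  {"date": "2022-03-28", "high_c": 14, "rain_mm": 0.0},
  {"date": "2022-03-29", "high_c": 9,  "rain_mm": 5.2},
  {"date": "2022-03-30", "high_c": 11, "rain_mm": 1.1}
]}
"""

data = json.loads(raw)
# The "gather information quickly" step: pull out just the rainy days.
rainy = [d["date"] for d in data["days"] if d["rain_mm"] > 0]
print(rainy)  # ['2022-03-29', '2022-03-30']
```

Once you've internalized this fetch/parse/filter shape, most one-off automation tasks are ten-minute jobs instead of afternoon projects.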


Regex has been very useful in my career. I would also add SQL as well.

Everything else on your list is highly domain specific and mostly useless outside of their domains.

> Calculus for solving all sorts of problems

I even like calculus, but the number of times I have used it in the tech industry is precisely zero (not counting big-O notation, which can be defined in terms of limits, because obviously that is not what is meant). There are specific domains where calculus comes in handy (I assume AI and graphics; I'm not really involved in either), but if you don't do that stuff it's super useless.


I've built high-scale and highly available systems. I've leveraged calculus exactly once: I used the built-in non-negative derivative function in Grafana to understand the rates of change in some metrics. Restated: I leveraged my calculus understanding to understand what the graph would represent. A couple of years of study to understand a definition on sight, lol.


(1) There are no magic bullets.

(2) End of list.

But seriously, "10x" aside, there are some techniques that have come up surprisingly often and been surprisingly effective during my career. A lot of them are around caching, cache coherency, memoization, etc. and I see many of those have already been mentioned. How to write a simple parser is another one, likewise.

The one I haven't seen mentioned (and don't expect to) is "poisoning" of data or requests. Sometimes it's not very convenient to delete a piece of data or turn a request around with an error at the very moment when the need for that is detected - often because of locking or reentrancy issues. Just marking something as invalid, until some time when removal is more convenient, can be a very powerful technique. It can also make monitoring such events easier.

I've brought this idea into a codebase at least a dozen times, and more often than not it has gotten a "wow, that makes things so much easier" reaction. I can't take credit, of course. Like many things around caching or state machines, I got it from the hardware folks where it's a standard technique. In general, software folks can learn a lot from looking at how hardware folks have solved similar problems in an even more challenging environment, and "bring back" those solutions to their own domain.
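For anyone who hasn't seen the pattern, here's a toy sketch of the "poison now, sweep later" idea (my own minimal example, not taken from any particular codebase): entries are marked invalid immediately, which is cheap and safe from any call site, and only physically removed later, when no iteration or lock is in flight.

```python
class TombstoneCache:
    """Toy cache where deletion is split into poison (cheap) + sweep (deferred)."""

    def __init__(self):
        self._data = {}  # key -> (value, alive)

    def put(self, key, value):
        self._data[key] = (value, True)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None or not entry[1]:
            return None  # poisoned entries look like misses
        return entry[0]

    def poison(self, key):
        # Safe to call anywhere, even mid-iteration: nothing is removed yet.
        if key in self._data:
            value, _ = self._data[key]
            self._data[key] = (value, False)

    def sweep(self):
        # Called at a convenient time, e.g. outside the hot path or lock.
        self._data = {k: v for k, v in self._data.items() if v[1]}


cache = TombstoneCache()
cache.put("a", 1)
cache.poison("a")
print(cache.get("a"))  # None -- logically gone before it's physically gone
cache.sweep()
```

The same shape shows up as tombstones in LSM-tree databases and as "dirty"/"invalid" bits in hardware caches.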


- understanding your tools: what are you using day in, day out. are there "tricks" or efficiencies you can gain by learning them better? can spending time improving or perfecting your abilities here offer marginal improvement that will pay back and compound over the long run?

- understanding your domain: having breadth and depth of knowledge in what you are building. what is the "pattern language" of your work? can you set up the scaffolding for your project faster/more efficiently each time because you have seen it before? are there common "gotchas" in your line of work that the uninitiated might not know?

- understanding yourself: how do you work best? times of the day, locations but also personal nutrition, sleep, exercise, etc? how can you remove obstacles to get into flow state when needed?

- understanding others: who are you working with? what unique relational aspects might improve the flow of discourse and work between them? can you cut through bureaucracy? can you word things for more effective impact? much like your own working preferences, what are the timing/style/mode/etc dynamics within a group that you can tweak to improve upon?


If you are doing too many parallel things and context switching becomes a PITA, have an org document for each of your tasks: write your thoughts, experiments with code and output, your decisions, and keep a references list with links and other important things.

It's also quite common to go back to some tasks to copy some code or pattern you used; this will save you a lot of time.

If you have ADHD, use it to keep focus. You can easily manage tasks inside an org document, even to the point of using just the console for work if too many things distract you.

Also check where you waste the most time; optimisation should be focused. Trying to increase your performance on a task which is not a bottleneck will cost you more time than if you just don't do anything.

Most of the time I ask myself "what is the next step?" and "what was I doing?", and I create small pieces of code which require a lot of plumbing/debugging to later fit into a bigger system. So org is the most productive thing for me.

But consider what you want to improve, before trying to improve any/everything.


what is an org document? is it orgmode?


Relational algebra, set operations and SQL. I learnt the power of parallel queries on massive datasets when directly manipulated with relational algebra, set ops and SQL. For eg: Create Table XX as Select a, b, c from YY, ZZ where ...

Lambdas, async and Linq. Somewhat related to the previous point, but combining these three helps with writing expressive and powerful declarative and functional code.
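The "Create Table as Select" pattern the parent mentions can be tried in seconds with an in-memory SQLite database (table and column names below are invented for the example):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, total REAL);
    INSERT INTO orders VALUES (1, 'ada', 120.0), (2, 'bob', 30.0),
                              (3, 'ada', 75.0);
    -- Derive a new table declaratively instead of looping in app code:
    CREATE TABLE big_spenders AS
        SELECT customer, SUM(total) AS spent
        FROM orders
        GROUP BY customer
        HAVING spent > 100;
""")
print(con.execute("SELECT * FROM big_spenders").fetchall())  # [('ada', 195.0)]
```

The win is that the grouping, filtering, and materialization are one declarative statement the engine can parallelize, rather than imperative loops you have to debug.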


> * Calculus for solving all sorts of problems.

I'm not sure if I've ever used calculus thinking for anything as a full stack developer. Never done much with calculating the rate of change of a rate of change or determining the area of an N dimensional shape. I've used many different packages and tools that are inspired by the basic ideas of calculus. But I've never had to consult a calculus text to write a given method.

In my experience, calculus and most forms of advanced mathematics are more "mental tools that might come in handy one day" than a 10x force multiplier.


What strikes me is that everything you listed falls under the Lindy effect: if it has been around for 30 years, you can expect it to be around for the next 30 years.

Among those, I can also recommend: Relational Databases. Unix tools for text manipulation, like awk, grep, and sed. Make. Architecture patterns.


* Dynamic Programming and the related concept of Memoization.

* Recursion - most complex problems seem to have a recursive solution which you can actually implement in a language like Scheme or Haskell that does not limit stack size.

* State machines.

* Monte Carlo methods.

* Unification - see SICP.

* Hand-rolled parsers instead of regular expressions for debug-ability and readability.
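Memoization is the easiest of these to demo. The textbook example: naive recursive Fibonacci is exponential, but caching each subresult makes it linear (this uses Python's standard functools.lru_cache rather than a hand-rolled table):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Each fib(k) is computed once, then served from the cache.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(50))  # 12586269025 -- instant; the uncached version takes minutes
```

Dynamic programming is essentially the same trick with the evaluation order made explicit, which is why the two concepts travel together.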


Hi! could you recommend some books to learn more about it? And how did you apply those skills to your day-to-day job?


You're way too over-focused on tech trivia in your list

The biggest force multipliers what a friend described to me as a young teenager as "be an and" - don't just be a programmer, be a programmer and a lawyer

Don't just be a welder, be a welder and a baker

Don't just be an engineer, be an engineer and a writer

Etc

You may have heard Scott Adams call it a "talent stack"

Same concept

Be exposed to as many ideas as possible (both good and bad), so that you can draw metaphors and connections between them - and know when you're seeing yet another example of a bad idea, so you don't waste more time than necessary moving to a good idea


I love this idea. You don't need to be in the top 1% of a field to do cool innovative stuff; just be good at two or three different fields and see what happens when you combine them. Currently going down this rabbit hole with computational archaeology and having a whale of a time :)


> just be good at two or three different fields and see what happens when you combine them.

"Jack of all trades, master of some." That's been my goal for most of my life.


The 300+ year-old quote is "A Jack of all trades is a master of none, but oftentimes better than a master of one"

Too often, we cut the first part off, and it sounds horrible!

The whole quote is highly insightful


Computational archeology sounds great, what do you do? Is there more demand?


It’s pretty fun! The field goes from ML image analysis on artifacts, such as trying to identify artists in pottery, through to satellite image analysis, LIDAR scanning, ancient DNA analysis. My friend does underwater archaeology in Japan and spends his time scuba diving and writing code to analyze the images he gets. I was keen on the field because it lets you get out the office for a few months a year for fieldwork. However I’m new to this so there’s a lot more for me to learn still! Starting an archaeology undergrad (distance learning) while working with a few research groups helping them with code in exchange for archaeology experience atm.

The best bit is some of the groups use languages like Haskell so the work itself, from a purely programming perspective, is interesting too!


This is complete bs; real innovation happens exclusively in the top 1 percent of the field. This is just coolaid that some people like to drink to de-emphasize the value of real skill and inflate their own egos.


They aren't talking about innovation but productivity. The OP asked "what bits of fundamental knowledge are productivity multipliers"; this should hold true at whatever percentile of a field you choose, top 1%, 10%, etc. However, it should be noted that the top 1% of innovators in a given field often do not understand the impact of their innovation in a multi-disciplinary or business context.


Some innovation indeed happens at the edges of what we know, and we learn as a species collectively. But most innovation is about combining multiple fields and cross-pollinating new ideas. Especially in the business world, where there is no unsolved problem, everything has been seen before, and most of us are just paid to get to the solution fastest, applying knowledge from other fields can be a great benefit.


It depends.

Real innovation in art software, for example, could only be done by someone who is highly skilled and experienced in both artistry and software engineering and has a very open mind when it comes to design.

You get things like Sonic Pi from people who are both musicians and developers.


I’d be interested to know how many interdisciplinary engineers you know who are the top 1% in two fields at the same time :)


*Kool-aid, you 1%-er.


guess it depends on whether it's the Jim Jones version, or the stuff that aids you in being cool ;)


This. I learned a lot of business and management skills on the side... sales, marketing, accounting, vendor negotiation, inventory, merchandise ordering, policy enforcement, staff training, firing people (and how much it sucks), etc... by helping run a furry convention for 5 years.

It's one of the best things I've ever done for my career. I enjoy casually listing that on my resume under volunteering. :)


This, so much. I started doing UX and web design as a teenager and only later learned to code. This now helps me tremendously because I learned valuable lessons: in UX/UI design, you usually do like 10 variations of your design if you're not satisfied. This fixation on details often forces you to forget about the whole picture. In the end, you have to take a step back and try out other, completely different approaches.


I appreciate this concept, but I'll need someone to tell me what the baker/welder is producing. At this point I NEED TO KNOW


I'd bet they can put a down a mean bead of fondant, or frost their joints well :P


I agree: Computing is terrific stuff, an historic opportunity, but by itself is like an ingot of steel -- needs an application. So, need steel AND something else. So, need computing AND something else. My something else is some pure/applied math that is an advantage in my startup.


I completely agree: you need to be good in 2 or more areas, not necessarily an expert at each of them. The combined effect is insane, and makes you very valuable.


Understanding of your organizations ways of working and making connections across teams.

Understanding how your organization works: This is not about sucking up to 'important' people (at least not when it concerns productivity). It's about understanding how the different organizational parts work, so that you can be most effective. E.g., there's tons of planning for resources/budgets that happens once a year, and if you are aware of it and can put your requirements into that list, it helps.

Making connections across teams: In large and complex organizations/products, a lot of knowledge is tribal. It helps to have friendly relations and people whom you could chat with informally in teams that you depend on. Many conflicts can be resolved faster.


Echoing sentiments expressed...

Automation: coding automation, test automation, devops, CI, CD; hopefully goes without saying these days, but it wasn't in your list and ought to be x10 at least.

Knowledge Transfer: be x10 while on holiday.

Delegation: for many its harder than learning something new.

There are tricks that make you personally more productive, but the key to overall productivity is effective team work, or all your productivity is x1.


Your list appeals to my inner geek. However, I suspect they are not the 10x multipliers in my practice. I may make an exception for knowing regex's and perl very very well.

My contribution to your list is a skill that I would really really like to have: the art of approximation. Specifically, in the style of Sanjoy Mahajan's book, Street-Fighting Mathematics: The Art of Educated Guessing and Opportunistic Problem Solving.

http://web.mit.edu/sanjoy/www/

After reading that book I have been slowly trying to accumulate approximation methods and metrics in whatever I read, in order to estimate and validate quickly.


For me the most notable one is the adoption of a GTD system for task tracking, which I did in concert with two related changes: starting inbox zero, and switching from bookmarking-and-categorizing to clipping-for-search reference material.

I started each of these around 9 years ago now, and can't imagine going back. The greatest impact isn't even literally on "getting things done", but on the time when I'm not working. I don't have to hold things in my brain. When I'm not working, I can forget about all that stuff and actually rest, or focus on other things. And when it is time to work, it's all right there. (And by work, I don't just mean the 9-5, but essentially anything that needs to get done, vs recreation. Although I do also use it to remember things I'd like to do someday for fun—everything from books I'd like to read and movies I'd like to watch to things like taking golf lessons, or buying a new helmet.)

We also of course use project tracking software for work, but I find it extremely beneficial to have my own personal system as well. Everything goes in there, from large, ongoing projects to 'remember to switch the laundry over'.

I wrote a blog post a few years back describing what works for me[1]. For the most part it still applies. I use Evernote, but would probably look at something like Joplin if I were setting it up today. (And will probably migrate, someday.) About the only other significant difference is I do a lot less sending emails to evernote now, since the gmail 'snooze' feature is pretty convenient. I'll still do it in cases where I want to add some notes though.

One thing to be aware of is if you're used to a less structured system (a written agenda, sticky notes, marked-unread emails, etc.) this will take significant time to adapt to. It was my primary focus for the better part of a month just to get things set up in a way that felt good to me, and then another couple of months before it stopped feeling like extra work to make myself use the system. But in the years since then, I can't imagine having lived without it. (And I built a fairly successful company and had kids over that period.)

[1] https://www.tempestblog.com/2017/08/16/how-i-stay-organized-...


The spacing effect:

The spacing effect demonstrates that learning is more effective when study sessions are spaced out. This effect shows that more information is encoded into long-term memory by spaced study sessions, also known as spaced repetition or spaced presentation, than by massed presentation ("cramming").

https://en.m.wikipedia.org/wiki/Spacing_effect

Spaced Repetition Software (like Anki) allows you to learn more things in less time – and remember them better, too. It's especially useful if you want to learn a large amount of things over a longer timeframe.
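The scheduling idea behind Anki descends from the SM-2 algorithm. A rough sketch (simplified: real SM-2 also adjusts the ease factor per card based on how well you answered): each successful review multiplies the gap by an ease factor, so review dates spread out roughly exponentially.

```python
def next_interval(interval_days, ease=2.5, repetition=1):
    """SM-2-style interval schedule, assuming every review succeeds."""
    if repetition == 1:
        return 1      # first successful review: see it again tomorrow
    if repetition == 2:
        return 6      # second: see it again in about a week
    return round(interval_days * ease)  # after that: multiply the gap

interval = 0
for rep in range(1, 6):
    interval = next_interval(interval, repetition=rep)
    print(f"review #{rep}: wait {interval} days")
# Gaps come out as 1, 6, 15, 38, 95 days -- five reviews cover ~5 months.
```

That exponential spreading is why a few minutes a day can maintain thousands of cards.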


1. Super learning techniques Srs, memory palace, speed reading or time shifting video/audio

2. Model based thinking: Big book of mental models is a great place to start.

3. Learn how to train your dog properly (perhaps the single biggest life hack a person can do imho)


I'd be interested to hear you elaborate on 3 more. I think you're probably onto something with that, but it's hard to say without more details.


* Don't get too hung up on computational complexity.

A lot of times O(n^2) is gonna be perfectly fine. Write it the simple and easy to maintain way first, and only worry about fancy optimizations if it's too slow.


Optimize for humans first and computers second. Computers can read the crappiest, most spaghettified code on the planet as long as it is syntactically and functionally correct. Humans cannot, and their time costs several orders of magnitude more.


In these days of modern CPUs with caches and branch prediction, O(n^2) often beats O(n log n) because your data is tiny compared to the cost of all the cache misses and branch mispredictions the seemingly faster algorithm creates. This doesn't apply when your data is large, but then you need parallel algorithms and out-of-memory data structures (or just punt all your data to the SQL server).


Yeah. n = 3 more often than you expect. n <= 12 far more often than you expect. If you don't work at a FAANG, don't automatically assume that your data and algorithm need to squeeze the last O(logn) possible out of the algorithm.


And if it is too slow a hash table or two will probably fix it.
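A toy illustration of that trade-off (my own example): finding a pair summing to a target is the classic case where the quadratic loop is perfectly fine for small n, and a hash table fixes it if n ever gets big.

```python
def has_pair_quadratic(xs, target):
    # O(n^2): simple, obviously correct, fine for small lists.
    return any(xs[i] + xs[j] == target
               for i in range(len(xs))
               for j in range(i + 1, len(xs)))

def has_pair_hashed(xs, target):
    # O(n): one pass with a set, the "hash table will fix it" version.
    seen = set()
    for x in xs:
        if target - x in seen:
            return True
        seen.add(x)
    return False

xs = [3, 9, 14, 20, 7]
print(has_pair_quadratic(xs, 23), has_pair_hashed(xs, 23))  # True True
```

Write the first version, ship it, and swap in the second only when a profiler tells you to.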


Learn how to lead from any position. You don't need to be promoted to lead.

Do things like teaching and mentoring, build consensus around decisions, involve people in your thought process early, actively seek out feedback on your ideas, alleviate people's concerns, help out where you can, be empathetic, work freely with anyone in the organisation as needed, gather data to support your opinions.

You'll soon be at a hub in the informal "attention network" that truly decides who calls the shots. Handle it with care and use it for good.


There is no list. It's not possible to say, "I know these 5 things, and they'll make me more productive." Feynman solved all his physics problems with calculus. That's it. One tool. Part of it is knowing your tools. The other part is being able to see a problem, break it down, and understand how to manipulate it into something that's attainable. That's it.

I worked with this guy Pat several years ago, and he was on a different level than the rest of us. PhD Mathematics etc. I remember the day that I understood what made him different than me. We were standing in the hallway to our suite in the office building, and the first time visiting he says "This will be easy to remember. Lots of 3's." I looked at him for a moment puzzling his response. He says, "2745, lots of 3s." Pat was better than me at programming, because he saw things in a fundamentally different way than I did. He did things like that all the time. He wasn't better because he knew C++ or Java more than I did. He was better, because he would approach a problem and find a shortcut, or an optimization that the rest of us just didn't see. If something just seemed off even if he didn't know the technology, he could go figure it out, because his grasp of what to look for was much greater. That is what makes someone 10x better.


>being able to see a problem, break it down, and understand how to manipulate it into something that's attainable

Wasn't this Alexander the Great's approach: divide and conquer?

Every problem gets solvable if you break it into enough pieces


> Every problem gets solvable if you break it into enough pieces

Someone needs to just sit down and break P vs NP into enough pieces.


Traveling Salesman is "solved" basically this way with modern map apps: it may not be "optimal", but it's close enough

And close enough, as the adage goes, is almost always "close enough"


> "This will be easy to remember. Lots of 3's." I looked at him for a moment puzzling his response. He says, "2745, lots of 3s."

I cannot for the life of me figure out what he means by this


Best guess I have is that 27 = 9 * 3 = 3 * 3 * 3 and 45 = 9 * 5 = 3 * 3 * 5

No clue if that's it though. Wonder if this "Pat" explained it to OP and OP is just keeping us hanging....


The number is divisible by three I guess.


(3^3)3.3.5 ? Maybe…


Learn to delegate/outsource at all levels from technical to mathematical: the best kind of delegation is to tools/systems/languages/type-systems (eg. "outsource your attention to the type-system") since it's always cheap enough to be in-budget...


Why is delegation important?


Because otherwise you are solving problems someone else can solve just as easily. It doesn't mean be lazy; it means be selective about what you choose to do yourself. It's a big one in the seven habits book: Do, Plan, Delegate, Eliminate.


Fundamental Knowledge that are productivity multipliers:

This is going to go against the grain, but: ignore these kinds of generalized lists. These things are very non-contextual and can be bad advice without the context they were meant for.

It's better to:

- Find people who are experts at things

- Find ways to talk to them on an informal level

- Learn how and why they are suggesting what they're suggesting

- Write and communicate what you learned/keeping a journal

Long term and deep learning will take you places. Quick trivia is a waste of time.


A lot of those are pretty domain specific. Calculus is 10x only because it lets you get into fields where calculus is 10x. I have no idea what a general everyday coder would do with it.

I could list tons of design patterns, like separate mutable and atomic readonly copies of the same data, but they wouldn't be universal, although patterns are wonderful.

The closest thing to a real 10x I know is laziness and respect for best practices, and what I call "decustomization", the art of removing any unusual technology from actual production systems.

Riced desktops? Gone. Custom made productivity apps? Gone. Simple bash scripts to move files around and do backups? Gone. Distro hopping? Extra gone. New JS frameworks? Gone but with napalm. Making your own programming language? Not gone because it never even started.

Any one of these things might not be a big time drain. But the general mindset of DIYing and building "just right" systems all together takes a lot of time.

Learning how to work "inside the box" with standard issue tools, and how to guess ahead of time how much maintenance work anything new will require, and to think in terms of how things fit ecosystems, rather than evaluating things separately, has probably been the most important thing I've learned in tech.

Every time I've done something "unusual" I've been convinced that the core design was better than what's out there. And almost every time, I've regretted it, because a great idea doesn't always make up for a lack of polish and compatibility.

The challenge of course is actually finding anything else you want to be doing with the saved time.


I agree, it's not usually a good use of time to reinvent the wheel, but what if there are unacceptable (e.g.) security risks with all the existing solutions? Then "just right" means "adequately addresses realistic security risks, where the existing options fail to do so." Developers correctly focus on building things; those things aren't often solid from a security perspective, and sometimes that matters. Abstracted, my point is that your point is correct in one domain, and there's probably always another, peripheral domain where it's not true: underlying assumptions mean whole ecosystems are flawed when used for a specific purpose: security, high availability, parallelism, etc.


I think developers almost always do address security; it's just that security is insanely hard and most teams don't have dedicated security specialists.

CVEs are almost always patched within hours to days on mainstream software; it's rare to just leave doors open. The difference seems to be in how much people tolerate things that might possibly carry an unknown risk, just because they are big or use dependencies, or because they allow a user to do something that might be a bad idea in some contexts (like having an unencrypted hard drive).

If you need extreme security, or some other specialist requirement, you're totally right: mainstream ecosystems can be unsuitable.


> Riced desktops? Gone. Custom made productivity apps? Gone. Simple bash scripts to move files around and do backups? Gone. Distro hopping? Extra gone. New JS frameworks? Gone but with napalm. Making your own programming language? Not gone because it never even started.

This sounds more like depression than something that helps you grow. Yes, in production and production-like environments, be boring. It helps. But discouraging people from getting into that is a path to discouraged engineers. People experimenting with weird things leads to innovation and learning.


It's true, you can't discourage it too much in people's free time, because a lot of them seem to love tinkering and that's the whole reason they started, but we also don't need to make it an industry standard expectation that everyone writes a parser generator at some point.

Innovation beyond the random toys level is really hard. At best most things can become a proof of concept that inspires some corporate dev. Unless you really enjoy that kind of random garage project, it seems reasonable to just leave it to the people who do.

There's a lot to learn in the world of boring unoriginal software. You learn how to take a task and do it with high level building blocks, and the minimum possible custom code. You're essentially designing a "Plugin" for a system made of the entire current software ecosystem, treating dozens of unrelated apps in a workflow as if they were a suite, taking into account the fact that you know people will want to look at data in Excel or control your thing with a MIDI keyboard.

I've done lots of random side projects, but I'm not sure I learned all that much, other than to really appreciate the value of the off the shelf thing I later replaced them with.

It's hard to objectively measure that kind of general learning, so maybe I am better off for them, but I also think I picked up some bad NIH type habits that I later had to unlearn, and caused some of my biggest tech related regrets.

Then again, if I was in a field where we actually did real "Paradigm shift" type stuff, I'd probably feel differently. But even then I'd imagine learning Haskell and linear algebra might be a better use of time, rather than implementing a window manager.


I would say a field either requires calculus or it doesn't. If it does, you need to know it, full stop. And if it doesn't, you don't need it, full stop. It's not the case that if you need it, it makes you 10x more productive.

All architects need architecture knowledge; you can't just say "knowing architecture makes architects 10x more productive".


If the field is just general programming then you could say calculus is a 10x career boost, because you can take on calculus related projects as opposed to being limited to mathless stuff.


Employ a virtual worker that schedules meetings and replies to low-level emails. Disconnect yourself from real-time communications during those hours and allow your VA to contact you in cases of urgently needed replies. I employ a lady 20 hours a week for this, so I get core time about 4 hours a day to do whatever I want. Even if it's pc gaming. Costs me $15,000 a year and is well worth it.


Can you elaborate on how this "virtual worker" works in practice? I'd love to learn more.


She logs in remotely on my email. To get her up to speed she watched what and how I replied to things for about a week, until I had her start drafting emails as replies that I would approve (or edit) and reply. After a while she just got it. She's based in the Philippines and works from 11p - 3a her local time. I proactively schedule my conflicts and times in my calendar, and then she can schedule anything in the open space.

I could just not answer for a few hours a day, but I have a lot of people who expect a quick response. This solves that and gives me back my evenings to spend with the family.


This is fascinating, thanks for sharing. Did you use upwork or a similar platform?


"Calculus for solving all sorts of problems."

Is this really common? I know I'm not in a top tier job, so maybe they don't give dummies like me the hard problems, but I've never had to use calculus to solve a real world problem.


I had the same reaction. I've forgotten most of it, but that's because there have been zero times in the ~18 years (oh god...) since I had a calculus class that I've seen anything I could use it for, without just making up a reason to use it.


Calculus is irreplaceable for understanding system dynamics. Every metastable computer system is an incarnate approximation of some differential equations relating the different parts of the system and the way changes propagate from here to there. I don't often use numerical methods, but analysis of stability, classification of fixed-point types, and topological limits on what small changes to the system can do to its behavior all come into play. The people enforcing backoff for retries, randomization, or careful analysis of fatal vs. temporary errors are using those sorts of insights. While you can take ten years of industry experience to get a feel for these flows, you can also take a good DiffEq/dynamical systems course and start out understanding the possible shapes.
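
The retry-backoff insight mentioned above can be sketched as follows. This is a minimal, hypothetical example (function name and parameter values invented, not from the comment): the exponential ceiling makes aggregate retry load decay geometrically, and the jitter de-correlates clients so their retries don't resynchronize into a thundering herd.

```python
import random

def backoff_delays(base=0.1, cap=30.0, attempts=6):
    """Exponential backoff with 'full jitter'. The deterministic
    ceiling grows as base * 2^attempt (capped), and each actual
    delay is drawn uniformly from [0, ceiling]."""
    delays = []
    for attempt in range(attempts):
        ceiling = min(cap, base * (2 ** attempt))  # exponential growth, capped
        delays.append(random.uniform(0, ceiling))  # full jitter
    return delays
```

Whether full jitter, equal jitter, or decorrelated jitter is best depends on the workload; the dynamical-systems framing is what tells you randomization is needed at all.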


I mean, you could call that basic logic and just take a philosophy course. That would provide formal and informal methods of evaluation without the need for actual calculus. This informal understanding and balancing is more closely aligned with philosophy than with calculus.


Nope. You could call that basic logic... and you'd be wrong, because it's not basic logic.


How so? The comment clearly states that they aren't doing any formal calculations, not using numerical methods, and just getting approximate insights.

If I'm wrong, show me I'm wrong and post a substantive comment. Snarky comments like that are not in line with the HN guidelines.


Not doing numerical analysis, but that's not the same as not doing formal analysis. I was thinking of things such as this fascinating paper, particularly the shape of the overall manifold that gives rise to Figure 2: https://sigops.org/s/conferences/hotos/2021/papers/hotos21-s...

The insights from these topics: https://en.wikipedia.org/wiki/List_of_dynamical_systems_and_...


It's not a snarky comment; the point is that it literally is not basic logic. You had no factual basis for saying it was.

Basic logic is here: https://en.wikipedia.org/wiki/Mathematical_logic

Dynamical systems has stuff that is not basic logic, such as this topological concept, which is applicable to understanding how stuff works without doing an actual calculation: https://en.wikipedia.org/wiki/Limit_cycle

That's calculus. Not logic, not philosophy.


You misunderstand. This is the basic logic I was talking about. Note my comment mentioned philosophy.

Same for limit cycles - they make no mention of this type of calculation. The concept can exist in other domains, such as systems thinking.

So concepts do exist outside the context of pure math. If you look at everything from a hard math perspective, then that is what you will see. You have to look at the context of the comments to understand. It's not necessarily calculus because the commenter might not be looking at it using those strict theories. There's no support for your assertion that it's not philosophy or not logic. It could coincidentally be similar to a calculus concept, but that doesn't mean that one is using calculus if they are approaching it from a different context that shares a similar concept.

https://en.m.wikipedia.org/wiki/Philosophy_of_logic


Limit cycles are a concrete example of "analysis of stability, classification of fixed point types, topological limits to what small changes to the system can do to the behavior."

This is calculus, not logic, not philosophy.


Did they say they're using limit cycles? They could be using less formal methods that are not calculus, like most people do. You have not spoken to the context or approach they are using. Similar concepts can exist in multiple domains with varying levels of formality.

Again, any evidence this is not logic or philosophy? Simply saying it isn't is not a valid argument. Just because you assert a concept exists in calculus does not mean a similar one doesn't exist in another domain/context. Any response to the majority of my comment, especially around context of use? Or are you just trolling by repeating the same line in multiple comments with no substantive support for your claims? Honest question.


How does logic help you understand what system dynamics will arise from choosing a particular exponential backoff or making a choice about picking some method with a particular known error rate, as the original commenter mentioned?


I realize that I'm in a less common industry, but I often end up using calculus and DiffEqs in my day-to-day work with drones.

To your point on system dynamics, queuing theory pairs really well with calculus on that as well. Being able to somewhat predict how a system is going to behave under load based on measurements and an understanding of how the queues work is super super valuable for decision making.
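
To illustrate the queueing point, here are the textbook steady-state M/M/1 formulas as a sketch (assuming Poisson arrivals and exponential service times, which real systems only approximate):

```python
def mm1_stats(arrival_rate, service_rate):
    """Steady-state M/M/1 queue metrics. The closed forms come from
    summing the geometric series over queue-length probabilities, a
    limit argument that only converges when utilization < 1."""
    rho = arrival_rate / service_rate        # utilization
    if rho >= 1:
        raise ValueError("unstable: the queue grows without bound")
    avg_in_system = rho / (1 - rho)          # L: mean number in system
    avg_time = avg_in_system / arrival_rate  # W = L / lambda (Little's law)
    return rho, avg_in_system, avg_time
```

With a service rate of 10/s, doubling arrivals from 4/s to 8/s (40% to 80% utilization) triples the mean time in system; that nonlinearity near saturation is exactly the kind of behavior the parent is talking about predicting.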


Just for personal investing I've used calculus amidst some basic modeling to estimate probability of getting certain outcomes under certain assumptions and compute an expected value. This is just to sanity-check my thinking, or to see what the market's thinking.

Understanding basic financial concepts, like the present value of an annuity or implied volatility, requires calculus.
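
For the annuity example, the standard closed form (a textbook formula, not the commenter's code) falls out of summing a geometric series, the same limit machinery calculus is built on:

```python
def annuity_pv(payment, rate, periods):
    """Present value of an ordinary annuity: the sum of
    payment / (1 + rate)^k for k = 1..periods, collapsed via the
    geometric-series formula. As periods -> infinity it tends to
    payment / rate, the perpetuity value."""
    if rate == 0:
        return payment * periods  # no discounting
    return payment * (1 - (1 + rate) ** -periods) / rate
```

A quick brute-force sum over the discounted payments matches the closed form, which is a handy sanity check when adapting it to, say, payments at the start of each period.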

In boring software, simple questions like, how often will a workload miss cache, and what optimizes the cost there, can be modeled... if you use calculus.

Also, in the past, I've casually been thrown into a couple of jobs where calculus was directly applicable to the work I was doing.


I took calculus in high school and did really well in it - but I honestly can’t say whether I’ve used it or not as an adult, because all of my mathematical knowledge kind of runs together in my head.

I realize it’s a big ask, but is there any way you’d be willing to put together an example of when and how you’ve used calculus for a specific purpose where it really shines?


Work-related, I built a backend of a database engine on top of a transactional distributed key-value store. When building a secondary index, suppose you traverse the primary index in a series of transactions, reading a batch of key/values and updating the secondary index. With the scheme that was used to build the index, this has the chance of conflicting with an incoming write, which will also try to update the secondary index. You'll have some kind of backoff/retry strategy. What is the optimal batch size? Treating it as a clean calculus problem with static, homogeneous workloads, you could model the probability of getting a conflict, the performance of a given batch size, set the derivative to zero and find the maximum. (In reality instead of modeling the probability, you can measure it.)
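
A toy version of that batch-size optimization, with entirely made-up parameters and a simplified conflict model (independent per-item conflicts, whole-batch retry), might look like this:

```python
def cost_per_item(b, fixed=5.0, per_item=1.0, p=0.02):
    """Hypothetical model: a batch of b reads costs fixed + per_item*b,
    each item independently conflicts with an incoming write with
    probability p, and a conflict aborts and retries the whole batch.
    Success probability is (1-p)^b, so retries are geometric and the
    expected cost per item is (fixed + per_item*b) / (b * (1-p)^b)."""
    success = (1 - p) ** b
    return (fixed + per_item * b) / (b * success)

# Calculus gives the continuous optimum by setting d(cost)/db = 0;
# here we just brute-force the integer batch sizes.
best = min(range(1, 200), key=cost_per_item)
```

Setting the derivative to zero for these made-up numbers lands near b = 13: big enough to amortize the fixed transaction cost, small enough that the conflict penalty hasn't taken over.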

In another instance, I got a summer job working at a company that made an optics-related product, and that involved a bunch of numerical integration, Fourier transforms, some optimization problems in stuff like calibrating equipment. So naturally, there was calculus.

Fun-related, recently, I noticed that a long-term call option I was holding seemed surprisingly expensive compared to just buying the stock on margin, given current and expected future interest rates, and comparing against the option's theta and delta values. Then I realized I was assuming the stock only goes up, ignoring short-term vs. long term taxes, etc. To get a general sense of how an alternative strategy of just buying the stock and adjusting position sizes at a few price points might work out, in a back-of-the-envelope calculation, I considered this problem: on average, how many times will a Wiener process W(t) completely cross the small interval [a,a+h] in the time interval 0<t<T (for some T)? What's its asymptotic behavior with respect to h, T, and a? (This problem of hedging options with stock, surely, is studied much more in depth, but I wanted to understand the general picture.)

Edit: I'll add, that's also taking for granted a lot of stuff with infinite series in comp sci that is also part of calculus. Like, if you want to pick a random number from 1 to 5, you know you can pick from 1-8 and retry if it's too large, and with calculus you know it will average a finite number of tries (8/5, specifically) because the infinite series passes the ratio test.
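
The 1-to-5-from-1-to-8 trick above can be sketched directly (helper name and simulation size are my own choices):

```python
import random

def pick_1_to_5(rng):
    """Rejection sampling: draw uniformly from 1..8 (three fair coin
    flips) and retry while the draw exceeds 5. Each attempt succeeds
    with probability 5/8, so the attempt count is geometric with mean
    8/5 = 1.6; the ratio test is what guarantees that expectation
    (an infinite series) converges."""
    tries = 0
    while True:
        tries += 1
        x = rng.randint(1, 8)
        if x <= 5:
            return x, tries

rng = random.Random(42)
samples = [pick_1_to_5(rng) for _ in range(100_000)]
mean_tries = sum(t for _, t in samples) / len(samples)
```

Over 100k draws the empirical mean number of attempts sits right around 1.6, matching the series calculation.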


Here is a new example. I was thinking about strategy as a pro golfer playing in a 72-hole tournament. Generally, you want to minimize expected value, get the lowest average score on each hole. It is obvious that near the end of the tournament, if you're behind, you want to play riskier golf, and if you're in the lead, you want to play less risky golf. That maximizes your chance of winning. This means instead of optimizing for expected value, you sacrifice expected value in exchange for higher or lower variance, respectively.

What I realized a couple of days ago is that even earlier in the tournament, you don't want to optimize expected value -- you always want to give some of that for higher variance. Why? Because in the variance/EV trade-off curve, the slope of the curve is zero at the maximum value, which means to add h units of variance, you give up o(h), realistically, O(h^2), expected value. So there is some value of h where the trade-off is worth it.

(Concrete example: Golf tournaments could be modeled as a coin-flipping contest, where you get +1 point for flipping heads, -1 point for flipping tails, with 72 coin tosses. Suppose you could pay a 0.01 point cost to double the point value, i.e. you get +1.99 or -2.01 as a result of the coin toss. Then you should, because doing so on the first coin tosses will increase your chance of winning.)
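
The coin-flip tournament model is easy to sanity-check by simulation. This is a sketch with an arbitrary field size, trial count, and seed; only the +1.99 / -2.01 payoff comes from the comment:

```python
import random

def tournament_win_rate(n_players=20, n_flips=72, trials=2000, seed=1):
    """Monte Carlo check: one 'risky' player pays 0.01 EV per flip for
    doubled stakes (+1.99 / -2.01), while the rest of the field flips
    plain +1 / -1. In a multi-player field the winner comes from the
    right tail, so the added variance should more than pay for the
    small EV cost."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        risky = sum(1.99 if rng.random() < 0.5 else -2.01
                    for _ in range(n_flips))
        field_best = max(
            sum(1 if rng.random() < 0.5 else -1 for _ in range(n_flips))
            for _ in range(n_players - 1)
        )
        if risky > field_best:
            wins += 1
    return wins / trials
```

With 20 players, a fair-game baseline would win 1 in 20; the risky player wins noticeably more often than that despite the negative expected score.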


> Chances are, every time you used your credit card, filled a prescription or pumped gasoline, a calculus formula was used to calculate it.

https://study.com/academy/lesson/calculus-in-the-real-world....


Looks like a paywall.

Yes, a calculus formula might have been used somewhere in that process, and was used by the finance person and dev. That doesn't mean I, or any other person, needed to know it.

As a dev at a financial institution, I've never needed to implement calculus formulas. Most calculations and formulas I've dealt with are algebra and stats. These usually are just a matter of using standard libraries too. So most devs aren't implementing formulas themselves, but just utilizing standard libraries.


> Looks like a paywall.

That's odd. It loads up just fine for me with no paywall.

Check this out for more: https://allusesof.com/math/51-amazing-uses-of-calculus-in-re...

The point is that calculus is used to solve all sorts of problems. Sure, you can say "...not in my experience" but that's one experience and not representative. You just happen to a) not work in a realm where Calculus is used and/or b) work within that realm only on problems without Calculus.


I'm not saying it doesn't exist. I'm saying that it isn't commonly used. Objectively, a great majority of people can live their lives without even thinking about or performing calculus outside of the classroom.

That is why I questioned whether knowing calculus is really a piece of fundamental knowledge that allows a 10x performance increase. For the majority of people it's not going to help them. Even if they deal with calculus in their field, I don't see it providing a 10x increase because unless one is a mathematician, the calculus portion of their job will be inconsequential and likely be automated or computed for them.


> Objectively, a great majority of people can live their lives without even thinking about or performing calculus

OK no argument there. The OP statement was "For example ... Calculus for solving all sorts of problems" so it appeared you were taking issue with the idea that Calculus can be used to solve all sorts of problems. Your argument is clearer now: even if Calculus can be used to solve all sorts of problems that doesn't mean that it is "fundamental knowledge" for most people.


The way I would put this is that it's important to be familiar with calculus, to know that it exists and have a sense of where it tends to pop up. From that familiarity, you can look up the details on demand. But being entirely unaware of the techniques of calculus would probably be a hindrance.

