Ask HN: What is the most impactful thing you've built?
729 points by rafiki6 on Nov 18, 2022 | 737 comments
I'll start. For me, I think the most impactful thing I've ever built was an internal application for an FX trading desk that eventually went on to handle billions in daily trades.

It didn't use any fancy frameworks, just plain old CRUD in Java.




During a centralisation of public school local servers to a data centre, I created a consolidated library enquiry system. It served over 2,000 libraries, had 330 million titles, and had about a million users. It was efficient enough to run off my laptop, if need be.

AFAIK it was one of the top five biggest library systems in the world at the time.

I was asked to add some features that would have been too difficult in the old distributed system. Things like reading competitions, recommended reading lists by age, etc…

I watched the effect of these changes — which took me mere days of effort to implement — and the combined result was that students read about a million additional books they would not have otherwise.

I’ve had a far greater effect on the literacy of our state than any educator by orders of magnitude and hardly anyone in the department of education even knows my name!

This was the project that made me realise how huge the effort-to-effect ratio can be when computers are involved…


Cool story! What languages, frameworks, etc. did you use? Or are you about to tell me COBOL? :P


The legacy back-end system being migrated was Clipper + dBase III on DOS, which is reminiscent of COBOL.

The part I added was built with ASP.NET 2.0 on top of Microsoft SQL Server 2005, and was eventually upgraded to 4.0 and 2008 respectively.

The only magic sauce was the use of SQLCLR to embed a few small snippets of C# code into the SQL Server database engine. This allowed the full-text indexing to be specialised for the high-level data partitioning. Without this, searches would have taken up to ten seconds. With this custom search the p90 response time was about 15 milliseconds! I believe PostgreSQL is the only other popular database engine out there that allows this level of fine-tuned custom indexing.
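
Conceptually, the indexing trick looked something like this toy sketch: every index structure is scoped to a single library, so a search only ever touches one small set of posting lists. (The real code was C# hosted inside SQL Server via SQLCLR; all names below are made up for illustration.)

  # Toy sketch of a partition-scoped full-text index. Each library
  # gets its own tiny inverted index, so a lookup never scans other
  # libraries' data. Purely illustrative -- not the SQLCLR code.
  from collections import defaultdict

  class PartitionedTitleIndex:
      def __init__(self):
          # library_id -> token -> set of title ids
          self._index = defaultdict(lambda: defaultdict(set))

      def add_title(self, library_id, title_id, title_text):
          for token in title_text.lower().split():
              self._index[library_id][token].add(title_id)

      def search(self, library_id, query):
          # Intersect posting lists within one library only; the
          # partition key keeps each lookup tiny and fast.
          tokens = query.lower().split()
          if not tokens:
              return set()
          postings = [self._index[library_id][t] for t in tokens]
          return set.intersection(*postings)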


p90 for a full-text search on 330 million documents was 15ms?

I know you can tune the hell out of search performance, but that seems a bit too insane for what looks like a relatively unspecialized setup (Standard DB).


Not likely the full book, just title, author and a few other low cardinality fields I'm sure. Also not likely 330 million unique volumes, but total books. This is within reach of a single database with proper indexing.


Can you elaborate a little bit more about how you partitioned it?


I simply added the "library id" as a prefix to almost every table's primary key. Every lookup specified it with an equality filter, so essentially it was thousands of standalone libraries in a single schema.
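
The access pattern, roughly, as a toy sketch (table and column names are invented for illustration; the real system was SQL Server 2005, not SQLite):

  # Sketch of the "library id as primary-key prefix" pattern.
  # Every query pins library_id with an equality filter, so the
  # engine seeks into one partition instead of scanning them all.
  import sqlite3  # stand-in engine, purely for illustration

  conn = sqlite3.connect(":memory:")
  conn.execute("""
      CREATE TABLE loans (
          library_id INTEGER NOT NULL,  -- partition prefix
          loan_id    INTEGER NOT NULL,
          title_id   INTEGER NOT NULL,
          PRIMARY KEY (library_id, loan_id)
      )
  """)

  def loans_for_title(conn, library_id, title_id):
      # The leading equality on library_id is what keeps this a
      # narrow index seek rather than a scan of ~2,000 libraries.
      return conn.execute(
          "SELECT loan_id FROM loans "
          "WHERE library_id = ? AND title_id = ?",
          (library_id, title_id),
      ).fetchall()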

One hiccup was that when the query cardinality estimator got confused, it would occasionally ignore the partition prefix and do a full scan somewhere, inflating the work by a factor of about 2,000! This would cause dramatic slowdowns at random, and then the DB engine would often cache the inefficient query plan, making things slow until it got rebooted.

This is a very deep rabbit hole to go down. For example, many large cloud vendors have an Iron Rule that relational databases must never be used, because they're concerned precisely about this issue occurring, except at a vastly greater scale.

I could have used actual database partitioning, but I discovered it had undesirable side-effects for some cross-library queries. However, for typical queries this would have "enforced" the use of the partitioning key, side-stepping the problem the cloud vendors have.

Modern versions of SQL Server have all sorts of features to correct or avoid inefficient query plans. E.g.: Query Store can track the "good" and "bad" version of each plan for a query and then after sufficient samples start enforcing the good one. That would have been useful back in 2007 but wasn't available, so I spent about a month doing the same thing but by hand.


This makes the performance a lot more understandable if you're only searching in a single library. I assume that cuts out >99.9% of those 330 million documents.


Is this Symphony or Horizon or Spydus or Koha? Or?


All those commercial systems existed before his (going by his use of SQL 2005).


Ah, I was just listing the ones that I could think of that use the concept of a library id.


> hardly anyone in the department of education even knows my name!

it's okay sir, we now know you as jiggawatts


... and this is how OCLC was created?


> had a far greater effect on the literacy of our state than any educator by orders of magnitude

Nice work, but check your ego, mate. Seems your growth hacking would have had zero result if those kids couldn't read to start with, so you could share some credit ;-)


"Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith."

https://news.ycombinator.com/newsguidelines.html


Maybe it wasn't meant that way. If they hadn't been there, then someone else would have been. You can be on the crest of a wave and not be responsible for its power.


By that logic, the people who farm the trees that make the books have more impact than anyone before them, unless you want to consider the people that make the tools, or feed the farmers, etc etc.


Any more info? This is fascinating.


> This was the project that made me realise how huge the effort-to-effect ratio can be when computers are involved

I love Steve Jobs' metaphor: computers as a bicycle of the mind [0]. Unfortunately, a lot of effort is concentrated on problems that scale to billions of people. There's a lack of attention to problems that would have a big effect for a relatively small number of people. It's a shame, because they're a blast to work on.

[0] https://www.youtube.com/watch?v=L40B08nWoMk


They really are. I think the most rewarding software I ever wrote was my first paid gig, where I automated lap swim scheduling for my local swim club. Took me maybe an hour, got paid more money than I'd make in two days as a lifeguard, and they were thanking ME for it. Turned out I had saved a volunteer upwards of an hour every week. With a shitty little JavaScript program.


The first time I heard that metaphor, I thought that he meant it in the way that bicycles are really fun to ride. I agree with both interpretations.


I have a bunch of small scale ideas that I want to implement. Not necessarily for profit. Any ideas on how to execute?


Pick one that scratches one of your itches, and get started. Release early, keep iterations small. It's easier to keep working on something you actively use.


Impactful?

Designed and deployed credit card readers used in gas pumps back in 1979. (Sold to Gasboy)

Wrote a fine tuner to allow communication between satellites (precursor to TDRSS days). Still used to this day.

Failover of IP in ATM switches (VRRP, PXE, secondary DHCP, secondary DNS, secondary LDAP, secondary NFS). While not invented here, it is still a common setup to this day.

Printer drivers for big, big high-speed Xerox printers on BSD. Still used to this day by big, big high-speed printers.

Also, early IDS products (pre-Snort) at line-speed. Sold to Netscreen.

Easy zero-setup of DSL modems before BellCore decided to complicate things (thus exploding their field deployment budgets; Southwestern Bell/Qwest enjoyed our profitable zero-setup). Sold to Siemens.

1 Gbps IDS/IPS before selling it to 3Com/Hewlett-Packard.

Now, I'm dabbling in a few startups (JavaScript HIDS, Silent Connections, replacing the systemd-temp).

Impact? It is more about personal pride but its impacts are still being felt today.


OMG, Who are you?


  user: egberts1
  created: May 5, 2015
  karma: 1337
chefkiss


Leet!


Razor and Blade?!?

Hack the Planet


How did you find all these product market fits?

Have you made more than a typical SWE?


It was actually a wandering hyperactive/ADHD mind that often said "why isn't there one" and followed through doggedly to the very end.

It is one of those traits where a mind clicks and says "this is it and how" and surprisingly gets into the most elusive hyperfocus/high-energy mode (without using any drug).

Slow-path network processing (arguably mine) was commercially made at Ascom Timeplex in 1982, and someone else leaked it to Cisco (or ripped AT's patent off). I got that from observing how different river bends (re)connect year-after-year while doing trout fishing trips.

Money-wise, I am disabled, got abled, disabled again in a different way, re-enabled, now just coasting with my own ideas: a JavaScript Host-Based Intrusion Detection/Protection System being one of them. And a portable AirPod detector (for home/auto/travel) is another idea. And DNSSEC for within the private enterprise is almost done.

Money is not my thing but it does help greatly in the pursuit of my ideals (so many hardwares, so many test equips).


How did you get disabled?


A bacterial infection. Differently twice.


Wear a rubber next time.


Kinda hard to do, I do sorta have to breathe, ya know.


Just not sure it would have helped much. Think Civil War battlefield infection.


can you please explain what is JavaScript Host-Based Intrusion Detection/Protection System?


It is simple. Too much malicious and privacy-violating JavaScript abounds, especially after being boiled down to seemingly-indecipherable WebAssembly bytecode.

And a typical enterprise NIDS would not be able to see beyond those encrypted packets containing JS over 2-way-signed TLS/SSL, or HTTP/3 (QUIC) (or a few other E2E protocols).

Since JavaScript won't be banned anytime soon (unlike Adobe Flash/ActionScript; BTW, Adobe's JavaScript is still being used within PDF files), this is another example of seeing a void and rushing to fill it for the betterment of Internet citizens.

Just yesterday, another "this is it and how" moment came to me: this Python PDF guy (and a few PDF experts) got me thinking "this is how to remove or make inert the JavaScript inside PDF": https://news.ycombinator.com/item?id=33646951


My understanding is that JS/WASM is sandboxed within the browser environment. It does have some access (say Camera) but only if you allow it.

Care to develop more on the potential attacks here?



Quick and dirty cleanup - convert to, then from, PostScript.
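
Something like this, sketched in Python around Ghostscript's pdf2ps/ps2pdf wrappers (illustrative only, and lossy by design - the round trip drops embedded JavaScript along with other interactive PDF features):

  # Round-trip a PDF through PostScript to strip scripted content.
  # Assumes Ghostscript's pdf2ps and ps2pdf are on PATH.
  import subprocess

  def defang_pdf(src, dst):
      subprocess.run(["pdf2ps", src, "tmp.ps"], check=True)
      subprocess.run(["ps2pdf", "tmp.ps", dst], check=True)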


I absolutely love your thinking. What you propose defangs the programmability aspect into an inert (but safer) "text-based" form.

But which side should assume the responsibility of this JS-defanging effort? Client or server? Postel said "be liberal in what you receive and conservative in what you send". So, being conservative (in this respect), the server has to be minimalistic (including denial of programmability).

The real problem remains: too much programmability is being made available for the client side to take in ... in a gullible way.

And no amount of Sideshow Barker (not a dig on HN's Sideshow Barker) can fix this, until one of the MAANG decides "enough".

Meanwhile, the wild Wild West shall continue.


[flagged]


I do enjoy sharing the fruit of my labor; but I share what money I have as well.

But, I "share" my money with those who provided me and others with things, like farmers, truckers, construction workers, plumbers, electricians, architects, textile workers, drafters, crafts-folks, artists, custodial, medical specialists, government workers, educational specialists, sanitation folks, engineers, engineers, engineers. Did I repeat that? Yes, more engineers.

I'm quite sure you do share your money too (and probably may not know the true extent of your reach).



At work: the CDN for Megaupload. I was also the guy who had to shut it down when the FBI seized it.

Personal non-code project: The first adult LEGO fan conference in 2000. While I got out of that business years ago, it has been replicated by dozens of other annual cons around the world. Back then the LEGO Group didn't really understand and was very wary of adult fans. Now there's a whole reality TV show about them with LEGO designers as the judges, and LEGO actively supports cons and clubs.

Open source project: A project I released anonymously ~2010. Several github repos (unrelated to me) keep this project alive (the main one has ~600 stars and ~200 forks) and it's apparently used in several commercial products too.

Website: ip4.me/ip6.me serves 3-5M queries per day. I want to find a good non-profit to take this over to keep it ad and javascript free forever.


Thank you for the Lego thing.

My mom got into adult Lego when she took apart my childhood Lego and reassembled it to resell.

Now we mail each other sets that the other is done with, and it gives us a great opportunity to connect. We're both anxious people and there's something relaxing about just assembling something where everything has a place.

When she found out there's a lego con in my town, she made plans to come visit me so we can go together and I can show her around the city I just moved to.


That’s a wonderful story!

My 3- and 6-year-olds love Lego kits. Historically I found myself sitting with them and helping when they got stuck, or directing them when I saw they made a mistake. More recently I decided to pick up my own kit and build alongside them. I'm currently working on the Saturn V rocket. It's been a lot more fun for me and a way to bond with my kids.


I am also in the middle of building the Saturn V Rocket. I picked it up after seeing that Amazon was stocking it again, which I thought was awesome since I missed out on the first run of them. I just got done with bag three. I haven't played with Lego since my teens. I'm in my mid-late 40's now. My kids are grown and I'm widowed, and I find a lot of peace in a lot of pieces ;)


>Website: ip4.me/ip6.me

>At work: the CDN for Megaupload. I was also the guy who had to shut it down when the FBI seized it.

>adult LEGO fan conference

Wow, what a small world. That's what I love about HN. The people that make things you use are on it :)

I wish I had something nearly as impressive. I just have open source stuff that people use. Nothing recognizable though.


Please tell me you're legally allowed to talk more about Megaupload and the work you did - sounds like an absolutely amazing blog post, would love to hear as much as you're able to discuss.

Also, I have a project in production at work where a device needs to grab its public IP address. My code has a list of sites that provide that info and I have ip4.me as a fallback in that list, so thank you for building it!


Legoland in my city still requires adults to be accompanied by children to enter. Kind of bizarre.


That doesn’t surprise me. It’s a very child focused park and I’m guessing they want to control the experience and environment as best they can. A bunch of high schoolers running around might change the dynamic.


I am charmed by the innocence of this comment.


I am appalled by the lack of statistical awareness of this comment.


I am astonished by this comment's lack of intuition about PR impact proportionality :) (It's not like we are talking about black swan events here - and one is more than enough for profits to go down the drain.)


I greatly enjoyed this comment.


> I want to find a good non-profit to take this over to keep it ad and javascript free forever.

Maybe worth reaching out to Mozilla. That's the only actual non-profit I can think of who I think would have both the ability and the incentive to keep it online.


> I think would have both the ability and the incentive to keep it online.

Ability? 5M/day for "what's my ip" is not much, and I'd wager most of us on this site would be able to keep it up and alive just fine. As for incentive... in addition to the Mozilla Foundation, orgs like Calyx, NLNet, Quad9 come to mind.


You are correct that it uses very few resources, especially since most queries are HTTP instead of HTTPS. Since there are no user accounts and it doesn't track anyone, it doesn't even have a backend database to connect to. Just a couple of dirt-simple programs written in C with some very easy-to-remember domain names.
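
The whole concept fits in a few lines. A sketch in Flask, purely to show the shape of it (the real services are small C programs, not this):

  # Minimal "what's my IP": no accounts, no database, no tracking.
  from flask import Flask, request

  app = Flask(__name__)

  @app.route("/")
  def whats_my_ip():
      # Echo the caller's address as plain text and nothing else.
      return request.remote_addr + "\n", 200, {"Content-Type": "text/plain"}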

I'm not getting any younger so it's really about survivability. Transferring to another individual HN'er probably wouldn't solve that.


I'm 25 and I'd totally take it on, but you're probably looking for an actual nonprofit if you want true survivability. As someone who has used this tool for at least 10+ years (as long as I've known what an IP address is), I'd love to help make sure it stays around.


Maybe it’s an option to found a nonprofit especially for this goal?

If it doesn’t already exist anyway.


The process is quite complex so it's usually too heavy handed for a project of this size. That's why people look for an existing nonprofit to take it on. But if kloch mainly wants a couple other people who care about the project to be around to make sure it continues, I'm onboard.


A non-technical nonprofit will fuck up regardless of the load. Beyond "keeping it online", it can be something as simple as "knowing how to configure the dns for it".


or maybe the internet archive?


what a chad


Probably this JavaScript function I posted on my blog in 2003 https://simonwillison.net/2003/Mar/25/getElementsBySelector/


Wow. You were the original querySelector. It's funny how you forget that somebody actually sat down and wrote these things into existence at some point. Thanks!


Even more impressive to me is writing things into existence without the benefit of being able to dig in to the underlying browser tech, and only being able to use the public (at the time) DOM APIs like getElementById, etc.


He's understating, perhaps on purpose.

Datasette, Django, and Lanyrd.


To be fair the original question was "most" impactful thing...


Ten years ago I was reading [0] and I remember your name was mentioned somewhere. Here is a quote:

> Locating elements by their class name is a widespread technique popularized by Simon Willison (http://simon.incutio.com) in 2003 and originally written by Andrew Hayward (http://www.mooncalf.me.uk)

[0] Page 91 from "Pro JavaScript Techniques" by John Resig.


Yeah here's Andy's getElementsByClassName post (via the Internet Archive): https://web.archive.org/web/20030402172546/http://blog.moonc...


Hey Simon, thanks for creating Django with Adrian. I was deeply interested in programming from a young age but learning Django in my teens sparked a passion for web development that has yet to feign so many years later! Appreciate all your contributions to this space.


OMG.

My most impactful thing I've done outside of paid work is a website running on Django. I could live without querySelector or its descendants, but not without Django.

Thank you, Simon.


wane


   /* That revolting regular expression explained 
   /^(\w+)\[(\w+)([=~\|\^\$\*]?)=?"?([^\]"]*)"?\]$/
     \---/  \---/\-------------/    \-------/
       |      |         |               |
       |      |         |           The value
       |      |    ~,|,^,$,* or =
       |   Attribute 
      Tag
   */
All regexes should have ASCII art explainers!

(from https://static.simonwillison.net/static/2003/getElementsBySe...)


What a coincidence. Just yesterday I used getElementsBySelector for the first time while making a Greasemonkey script.


Wow, 10 years before document.querySelectorAll()!


> Wow, 10 years before document.querySelectorAll()

querySelectorAll wouldn't ever have appeared without jQuery, which got the idea from Simon's work.

And even then querySelectorAll was so poorly implemented that it didn't even have any useful helper methods.


I like seeing this. At the time I remember thinking we needed something like this, and why doesn’t the browser have it already?

Then thinking, I suppose you could do it by (exactly the method you used), but never actually doing it because if it were that simple, someone would have already done it.

Actually, seeing the date, I realize this predates me even leaving high-school, which makes it even more atrocious that I never knew of it!


IOU 1 beer.


A pipeline approval tool (internal at Amazon) that counts metrics.

I was a fairly fresh college-hire SDE1 at Amazon. And I was annoyed, because I'm lazy. Every time I was oncall, I had to manage the deployment pipeline for my team's software - the UI for the tool used by Pickers inside Amazon warehouses. On Monday, deploy the latest changes to the China stack (small). On Tuesday, check if anything bad happened, and then deploy to the Japan stack (small-ish). On Wednesday, Europe (big). Thursday, North America (biggest). Repeat each week.

And I thought "why am I doing this? There are APIs for all of this stuff!". So I made an automated workflow that hooked into the pipeline system. You gave a metric to look for, a count of how many times the thing should have happened, and an alarm to monitor. If everything looks good, it approves. I hooked it up for my pipeline, and then it usually finished the entire weekly push before Tuesday afternoon. I made it in about 2 weekends on my own time.
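
The core check amounted to something like this sketch (all names here are hypothetical - the real tool talked to Amazon-internal pipeline and metrics APIs):

  # Sketch of the approval loop: approve the next pipeline stage
  # only once the metric count and alarm state both look healthy.
  import time

  def looks_healthy(metrics, alarms, cfg):
      count = metrics.get_count(cfg["metric"], window_hours=cfg["window"])
      alarm_ok = alarms.get_state(cfg["alarm"]) == "OK"
      return count >= cfg["expected_count"] and alarm_ok

  def run_approval(pipeline, metrics, alarms, cfg):
      # Poll until the current stage has baked and looks good,
      # then approve so the weekly push rolls on without a human.
      while not looks_healthy(metrics, alarms, cfg):
          time.sleep(15 * 60)
      pipeline.approve_next_stage()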

And I left it open for anyone in the company to configure for their own pipelines. A few weeks later I was checking if it was still operating normally and realized there were something like 50 teams using it. Then 100. Then a lot more.

The last I heard, it's considered a best practice for all teams within the company to use it on their pipelines. Before I left in 2021, it was running something like 10,000 approval workflows per day.

I named it after the BBQ/grilling meat thermometer in my kitchen drawer- "RediFork". Given the overlap of "people who read HN" and "devs who worked at Amazon", I probably saved someone reading this an aggregate hour or two of work.


I had always wondered why it was called "RediFork"... thought it might have been using Redis or something.

Thank you for creating it!


Literally stole it from this: https://www.amazon.com/Maverick-RediFork-Rapid-Matrix-Thermo...

E.g.: stick a fork in it and see if it's done yet.


Holy shit. Forget counting the hours (which are easily into the tens, if not hundreds) you saved me and my teams - more importantly, RediFork was a great "on-ramp" for easily introducing observability in existing services without instrumenting anything new, and a great way to demonstrate the power of automation to newbies.

From one "engineer whose irritation at inefficiency spawned a whole tool" to another (I got sick of staying up overnight to run load tests, so wrote myself an automation and monitoring tool - which got picked up, spun off to its own team, and now is used by >300 teams) - thank you!


You saved Amazon a lot of wasted man hours. I hope they compensated you well


Hope so too but

> I made it in about 2 weekends on my own time.


In 1995 I (and a few others) designed and built the first WiFi node [1]. At the time there was only one WiFi unit in the world, and it was the one on our bench. It now has about 20 billion descendants.

[1] https://sci-hub.se/https://doi.org/10.1109/40.566198


Awesome. What made it win? Were there any close competitors? Was infrared close to being the winner? I'm also surprised a big enough FPGA was already around. Thanks.


> What made it win?

Initially, ignoring the wisdom of the time that said OFDM was no good for indoor channels. The research project was eventually shut down due to lack of commercial interest, but the research leaders had enough faith to immediately start their own company (Radiata). Later, commercial success for Radiata came from being in the right place at the right time.

> Were there any close competitors?

In the research phase, not that I was aware of. In the commercial phase, Atheros. The story I was told after the event was that Cisco had decided to buy whichever company came to market first. Radiata came to market 2 weeks before Atheros and so Radiata was acquired.

> Was infrared close to being the winner?

It could have been, but specular reflection in IR channels causes inter-symbol interference, which limits the data rate. If someone could have solved that problem then IR might have happened instead of WiFi.

> I'm also surprised big enough FPGA was already around.

At the start of the project FPGAs were not big enough, so we had to partition across multiple 3000 series Xilinx parts. Bigger FPGAs had been released by the end of the project, so the transmitter fitted on a single XC4025 FPGA, using manual placement. The 4025s were brand new and Xilinx (as always) were difficult to deal with, so we had to beg for devices and they magnanimously granted us 3 or 4 chips.

At the time there wasn't much sense of occasion, as we were busy doing the work and none of us knew how big it would get.


CSIRO ftw. Now if only in my part of the world we could get CSIRO scientists onto the same actual page as scientists from another 'industry' council, and academia, instead of them just paying lip service to each other in the public forum but actively undermining each other behind closed doors.


Most of the team was Macquarie University.

At the time the collaboration with CSIRO worked quite well, as there were no business development types involved. In 1995 CSIRO was more concerned about science than IP. Since then they have become more money/IP focused. Maybe they got gold fever from the $1 billion in royalties they made from their WiFi patent?


First job out of college, I was at a consulting firm doing software development for DHS (Homeland Security). I got a lot of flak from my friends and family for "working for the devil", but the work was actually objectively good for society. Basically, there was a big data problem: when an immigrant trying to cross into the US illegally was apprehended and was sick, their custody would be transferred from US Customs and Border Protection (CBP) to Health and Human Services (HHS) so they could receive medical attention. There was zero data transparency between these two orgs, so when that transfer happened it usually caused families to be separated (sick dad, healthy mom and child: sick dad gets brought in for care and never finds his family again). Since HHS and CBP didn't have data communication and everything was siloed, the handoff was really poor and they often wouldn't find each other for months afterwards.

There was a lot of talk about this in the news, and although the software I was working on didn't entirely fix the problem, it allowed the agencies to communicate better. Their data wasn't siloed, and families got separated for only a few days rather than (sometimes) permanently.

I really miss that job. The pay was atrocious and there was zero WLB, but everyone agreed it was an important problem to solve, and I think the tool we had built really was helping.


Sadly true and relatable. Thanks for fighting that good fight.


Wow, that's fantastic work! Hopefully you haven't been turned off of gov work forever, there are now more and more programs to bring in tech talent (e.g. the Presidential Innovation Fellowship, TechCongress, USDS, U.S. Digital Corps) at more reasonable pay scales for impactful roles. May be worth checking out if you want to do it again


Let's Encrypt (along with the coauthors of https://abetterinternet.net/documents/letsencryptCCS2019.pdf and many other contributors). Now the world's largest public certificate authority!


You win. I'm only the co-founder of Digitalocean. I tip my hat to one of the giants we all stand on the shoulders of.


The Let's Encrypt team was a large group effort across three organizations, plus a fourth organization that was spun off to develop and run it, and I in turn tip my hat to all of my colleagues. :-)

(Including Peter Eckersley https://en.wikipedia.org/wiki/Peter_Eckersley_(computer_scie... who passed away earlier this fall at just 43.)


And yours is what I run my business on. Thank you!


I refused to use certificates for my own projects as it was too complicated and expensive, until LE came along. Thank you!


Just to say thank you for Let's Encrypt - it's saved my bacon at least a dozen times in the past six months alone.


Let's Encrypt is incredible. Thank you!


I built a WordPress plugin that helps you generate a free SSL certificate using Let's Encrypt. At its peak, it was being actively used by 50,000+ sites [https://wordpress.org/plugins/ssl-zen/].


I think we all owe you, your co-authors, and the sponsors a big thanks.


Thank you so much for this! I use it on my personal site and it was as simple as configuring a few cron jobs!


You are doing the lord's work. Great job!!!


A temporary, low-resource form for people in Puerto Rico to send an SMS message out to family outside of PR after Hurricane Maria.

During Hurricane Maria most of Puerto Rico was offline. Slowly but surely, some people started having access to some online services. To this day I don't know how, but I saw frequent posts on social media (Facebook and others) from people saying they could access spotty internet while SMS and calls weren't working, asking others to let their family outside of Puerto Rico know that they were okay.

So I set up a site on glitch.com with a really simple two-field form: one for a phone number and another for a message to send. It was dead simple, no framework, no CSS, just little bits of vanilla HTML and JS, and a bit of backend code connected to Twilio. Some text at the top with instructions too. I intentionally made it small so that a spotty connection wouldn't have a problem using it.
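
The backend boiled down to roughly this (a Flask rendition for illustration, with invented config names - the original was a quick glitch.com app, not this exact code):

  # Two-field form handler: a phone number and a message,
  # relayed out as an SMS via Twilio.
  import os
  from flask import Flask, request
  from twilio.rest import Client

  app = Flask(__name__)
  twilio = Client(os.environ["TWILIO_SID"], os.environ["TWILIO_TOKEN"])

  @app.route("/send", methods=["POST"])
  def send():
      twilio.messages.create(
          to=request.form["phone"],
          from_=os.environ["TWILIO_FROM"],
          body=request.form["message"][:160],  # keep it to one SMS
      )
      return "Message sent."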

Any time I saw someone posting on social media asking for someone to reach out to their family, I posted a link. I also shared it in a Slack where many from the PR diaspora were trying to contribute ways to help. Before I knew it, thousands of people were using it. I did some continuous monitoring to make sure nobody was abusing it and that it was being used as intended. It would have been EXTREMELY easy for someone to abuse it if they wanted to.

No one abused it. Thousands used it as it was intended. Left it up for weeks, and I kept monitoring it to make sure it wasn't being abused. I eventually saw it had stopped being used entirely for two weeks and spun it down.

I saw some people posting about it afterwards, thankful they were able to receive messages from their family, and I'm happy I rushed to write very sloppy, high-impact code.


Awesome story, well done putting it out there, glad nobody abused it and thanks for sharing!


A hotel concierge that’s helped 50 million guests during their stay. The goal is to create unforgettable experiences for a billion people!

Ivy sends you a text message introducing herself as a virtual concierge when you check in. She answers FAQs in 1 second using NLP and routes anything more complex to the front desk team for resolution in 2-3 minutes. All in one simple text thread, no apps or UI needed.

Guests often come to the front desk trying to tip Ivy, rave about her in reviews, ask her out on dates, and even drop off handwritten thank-you notes for her.

One woman texted Ivy in a panic asking about the nearest drug store to buy Benadryl because her son was having a severe allergic reaction. A guest service agent brought Benadryl to her door in 3 minutes at a large Las Vegas property. She called Ivy a life saver.


Is this a startup? Does it have a name?


Yes, the startup was named Go Moment. Got acquired a year ago by one of the world's largest hotel tech companies (Revinate) where Ivy continues to grow and serve more guests. More info at https://rajsinghla.com/about


Found this blog post about it [0] but not a landing page for the actual product.

[0] https://chatbotsmagazine.com/ivy-is-a-a-24-hour-virtual-conc...


That is awesome. Great work.


Thank you for the kind words. Hotel tech/travel tech is a crowded and tough space to build in. Rewarding to see things work better for real people though!


Monodevelop, I think: https://www.monodevelop.com

It wasn't a planned thing. I had recently got injured playing football, so I was stuck at home, not able to walk or drive. I started checking the #mono IRC channel (it was 2003 and the internet was something you did over a 48k modem, when your home phone line was not needed). Some guys, led by Miguel de Icaza, the founder of GNOME, were implementing a C# compiler and a bytecode interpreter for .NET IL, and I was very curious about it. I kept downloading, compiling and trying things out.

Then one day Miguel wrote in the channel that it would be nice to have some graphical editor and that somebody could perhaps port SharpDevelop over to Linux by replacing Windows.Forms with calls to GTK. I said that I'd give it a shot and... well, 10 days later we had a working editor and half a dozen contributors.

https://tirania.org/blog/archive/2008/Mar-14.html


I love MonoDevelop; I used it to write C# for Linux before the .NET Core days.


I guess my most impactful project was a microprocessor-based weather station for siting wind energy systems and fruit frost prediction in the early 1980s. Turned out that one of my stations, being used by a frost predictor, was across the street from a rural drainage ditch in which a young child was discovered face down in the water. The frost predictor faxed temperature profiles for the previous several hours to the hospital, where doctors determined the child could be revived. She was.


wait... they won't try to revive a child unless they can first prove that said child can be revived? Why not just... try to do it regardless and hope for the best?

Also, as a new parent, my immediate thought is of course "WHO wasn't watching the kid??"


Just a guess, but it might have something to do with whether the brain is able to be saved vs. just the body.


Yes, but why wouldn't you try regardless of knowing? In the time you spend gathering temp data, you could already be reviving.


Because chances are extremely slim:

> Various degrees of hypothermia may be deliberately induced in medicine for purposes of treatment of brain injury, or lowering metabolism so that total brain ischemia can be tolerated for a short time. Deep hypothermic circulatory arrest is a medical technique in which the brain is cooled as low as 10 °C, which allows the heart to be stopped and blood pressure to be lowered to zero, for the treatment of aneurysms and other circulatory problems that do not tolerate arterial pressure or blood flow. The time limit for this technique, as also for accidental arrest in ice water (which internal temperatures may drop to as low as 15 °C), is about one hour.[84]

Also you can't just warm the body back to 38 degrees, it should be carefully brought up AFAIK.


I imagine they were doing that regardless, but were very happy to learn that there was an actual chance of success.


That, in fact, is how I figured it went. I did not get to see the video, so everything remains speculation.


Wow! That's a pretty amazing story. Thank you for sharing.


Thank you! Took me by surprise when my client phoned and said my weather station was on the evening news.


That’s amazing.


Interesting that one of the developers on my projects was Dan Wood.


I'm one of the founders of ODK[1]. It's an open-source offline data collection app that, according to WHO[2], helped eradicate wild polio from Africa. It's become the de-facto app that social impact orgs (e.g., Red Cross, Carter Center) globally use to collect data in the field. It's kinda wild to think about, to be honest.

[1] https://getodk.org

[2] https://www.africakicksoutwildpolio.com/the-top-five-tech-so...


We never met, but I had evaluated ODK for the SharedSolar project at Columbia U for data collection purposes. I used to work alongside mberg, and ODK was an indispensable tool for nearly all the health initiatives and surveying. Truly impactful, congratulations!


I love ODK! I've worked with it for a community health volunteer program and contributed a bug fix, even. The impact of ODK goes beyond the software itself, since so many other survey platforms use it as their base. Kudos and thanks from everyone in global health.


I had the privilege of being a contributor to this project.

(Hi Yaw!)


I built the first "post-play" experience for Netflix. It made it so that Netflix would automatically start playing the next episode of the show you are watching after a 15-second countdown. We built it in the Silverlight player on the web because it was the fastest way to A/B test new features at the time.

Before post-play, you had to open the episode menu and click on the next episode to play it. We didn't want to do autoplay for a long time because we were afraid people would fall asleep with Netflix playing and it would break the internet. So we included the now-infamous "Are you still there?" popup a few minutes into episode 3 if there had been no interaction with the player.

Now it is everywhere - YouTube, Hulu, HBO, etc. And people watch way more TV than they should.


I want to say I hate automatic playing of content after my content is complete, but when I really think about it, I love it when I want it to do that and hate it when I don't, and I'm too lazy to tell my UI which is which.

I guess when something just works your users will assume the cases where it is working properly are just the way things are and the cases where it does something they don't like is your fault.

So well done!


I miss contemplating the content I watch. The attention economy has really perverse incentives. No thinking, only consuming.


I hate, loathe, and despise that fucking thing!

It prevents me from being able to see the credits! Sometimes I want to know who played what part!

I'm okay with an optional prompt that lets me skip the credits if I want to, but that should NEVER be the default!!!


I'd love it if it waited until the video's actually done, like on YouTube.

Note: I work at YouTube :P


> It prevents

It doesn’t…


As I was reading your comment I was thinking "whoa, that sounds like Damien or Robert" and sure enough :)

Hope you are doing well!


Hey Kyle!


This place is bats, haha!


I wonder how much bandwidth this is wasting.

Why not just have a next episode button without auto playing the next episode? Make the autoplay optional and not the default.


I prototyped this as a Java Robot in like 2011 so I could fall asleep to Futurama. I guessed Netflix would take steps to ban it, but later they embraced it.


Ah good old Silverlight. I once wrote a Drag and Drop library in SL. Good times. I miss XAML.


Wrote del.icio.us and invented tagging. Echoes of my original design are still around (notably account urls being web.site/userid)



I'm still a bit traumatized by the pink popularity badge. What was I thinking?

The WordPress suggestion in the response is just A+.


Delicious was awesome, and is dearly missed. Thank you for your contribution.

I have always wondered if it could be scaled to a Google alternative. Ranking pages by how many people have bookmarked them seems like a good alternative to PageRank.


Yes. The experiments at Yahoo showed that bookmarked things were MUCH, MUCH more likely to be highly ranked.


Thank you Josh, and you are not forgotten :-) ! del.icio.us was indeed extraordinary. One thing people forget was how good del.icio.us was as a search engine, sometimes returning search results that were superior to those of Google. If you can beat Google at search, even in a limited area, you are on to something. In retrospect, the acquisition by Yahoo could perhaps have gone differently and was maybe not the best outcome for this remarkable technology.


My first web app was a del.icio.us clone, it helped me get started as a web dev. I still write clones of it when testing new web tech. Many thanks!


My social network Touchbase (www.touchbase.id) owes you a great deal for account URL schemas used by online platforms, thank you!


I loved that site man.


I miss it


yo


I created Tiny Flashlight for Android 12 years ago. It's been downloaded almost 500M times. Back then every hardware vendor implemented the camera API in their own way and it wasn't easy to start the camera LED. I had to purchase many different devices from different carriers from all around the world just to find a way to start the camera LED. It was very helpful when the vendor published the kernel source code with the camera drivers for the particular device model. I could send custom commands to the driver to start the LED on devices where it was not possible using the standard camera API.


How many Android phones does it take to turn on a lightbulb ...sorry.


Around 200 ;)


Haha, this is sweet. Did you monetize it? Did you make your money back from buying all those phones, or just consider it a tax for making a cool app?


I used Google AdMob (back in the day it was only AdMob) to monetize it. Thanks to the monetization I managed to purchase all of these devices and develop the app further. It was like finding a way to solve a tiny bootstrapping problem.


Most impactful, in order:

1. Was the intern that coded the mechanism to open/close the LIDAR cover on the Mars Phoenix Lander, so my code runs on another planet. I also did circuit work, and other tasks for the CSA’s contribution to that mission. That was also the internship where I (re)met my wife.

2. Was on the Android team that brought video to Instagram back in 2013. We brought gyro stabilization to the iPhone, couldn’t quite get it running reliably on Android via the NDK, but I damned well tried.

3. Wrote the first Android app for Instacart.

4. Currently rolling out our new software platform to handle $15B/year revenue for Anheuser-Busch’s supply chain. We have 1000+ companies relying on us to ensure they can order and fulfill products.

Unsure what’s next, but it’ll likely be high impact and fun too.


Can you talk about the AB software? At least in generic terms?


I plan to write a blog series about it. It’s a fascinating story and product, that is still evolving.


First thought was “Without this guy, NO BEER!”. Then I realized “Without this guy, NO Anheuser-Busch beer!” Two different things ;)

Seriously, fun cv :)


PhD?


I might do one when I retire. I did most of my Masters in medical imaging, but left before I wrote my thesis. I want to be able to say, at the end of my life, that I did it.


In late 2013 I came up with the first memory hard Proof-of-Work puzzle, Cuckoo Cycle [1], based on finding fixed-length cycles in random graphs. Recently, custom chips were developed to solve it more efficiently than GPUs can.

That probably had more impact than the Binary Lambda Calculus language I designed [2] or the logical rules of Go I co-formulated [3].

Computing the number of Go positions [4] or approximating the number of Chess positions [5] had little impact beyond satisfying my intellectual curiosity.

[1] https://github.com/tromp/cuckoo

[2] https://tromp.github.io/cl/cl.html

[3] https://tromp.github.io/go.html

[4] https://tromp.github.io/go/legal.html

[5] https://github.com/tromp/ChessPositionRanking#readme


> the first memory hard Proof-of-Work puzzle

Scrypt is from 2009, per Wikipedia. That's memory hard, and using hashes with some zeroed out bits is a thing done for a long time (Bitcoin 2009; some old meaning of "cryptographic pepper" (fallen out of use) that iirc dates back to the 90s). Am I misunderstanding what you built?


Scrypt [1] is a password-based key derivation function (PBKDF), which can be used as a hash function that takes a configurable amount of memory to compute.

The reason it makes a very poor PoW (as choice of hash function in the Hashcash Proof-of-Work) is that the PoW verifier needs as much memory as the PoW prover, whereas a good PoW should be instantly verifiable.

This is why blockchains using scrypt as hash function severely limit the amount of memory used (usually to 128KB). So that verification, while slow, is not horribly slow.

Cuckoo Cycle also requires a configurable amount of memory to solve (subject to certain tradeoffs), but crucially, can be instantly verified with no memory use at all. And thus makes for a good PoW.
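
A toy illustration of that asymmetry (NOT the real code - see [1] for that; a generic hash stands in for siphash here and all parameters are simplified):

  # The proof is a handful of edge indices. The verifier just
  # recomputes those edges from the header and checks that they
  # close a single cycle -- constant memory, near-instant.
  import hashlib
  from collections import defaultdict

  def edge(header, nonce, n_nodes=1 << 20):
      h = hashlib.blake2b(f"{header}:{nonce}".encode()).digest()
      u = int.from_bytes(h[:8], "little") % n_nodes
      v = int.from_bytes(h[8:16], "little") % n_nodes
      return ("u", u), ("v", v)  # bipartite node sets

  def verify(header, nonces, cycle_len=42):
      if len(set(nonces)) != cycle_len:
          return False
      adj = defaultdict(list)
      for n in sorted(set(nonces)):
          u, v = edge(header, n)
          adj[u].append(v)
          adj[v].append(u)
      if any(len(nbrs) != 2 for nbrs in adj.values()):
          return False
      # Degree 2 everywhere plus connectedness == one single cycle.
      start = next(iter(adj))
      seen, stack = {start}, [start]
      while stack:
          for nxt in adj[stack.pop()]:
              if nxt not in seen:
                  seen.add(nxt)
                  stack.append(nxt)
      return len(seen) == len(adj) == cycle_len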

In the form of the Cuckatoo32 variant that most mining takes place with, it requires 0.5 GB of SRAM and 0.5 GB of DRAM to solve most efficiently.

[1] https://en.wikipedia.org/wiki/Scrypt


> the PoW verifier needs as much memory as the PoW prover, whereas a good PoW should be instantly verifiable

Ooh, yes I see, that is a big difference. Cool work!


Scrypt was thought to be memory hard. It was not, and making ASICs for it ended up being pretty trivial.


Scrypt is memory hard. The reason ASICs were easy to make was the small memory requirement chosen to make PoW verification not too slow.


I was a contributor to a little pair of libraries called KHTML and KJS, an HTML renderer and a JavaScript interpreter. I joined about a year into the project, and while I didn't lay the foundations I helped improve the DOM and JS support a fair bit.

People I respected told me I was wasting my time because Internet Explorer was the de-facto standard and the idea of a new browser engine becoming prominent was fantasy.

Then Apple decided they wanted to do a browser and looked around at what open source engines were available that they could use as a starting point. Thus was born WebKit [1].

I consistently ignore anyone who tells me I shouldn't try something because it's "too hard" or "nobody will use it". Most of the time they turn out to be right. But not always.

[1] https://marc.info/?l=kfm-devel&m=104197104218786&w=2

Edit: Here's an interesting presentation by Lars Knoll and George Staikos on the history of the project: https://www.youtube.com/watch?v=Tldf1rT0Rn0


> I consistently ignore anyone who tells me I shouldn't try something because it's "too hard" or "nobody will use it".

Inspiring. Thanks for your contribution!


Do you have any insights into how someone should approach a renderer today for HTML and CSS 2.1 rasterization?

Tiled rendering seems to be what all the major renderers use, but the layers of abstraction they utilize to get there are so dense they're unreadable without extensive amounts of time.


When the big earthquake in Nepal happened in 2015, I was working with a volunteer organization called Translators Without Borders to help with translation during relief efforts. Since I was in the USA I could not contribute back physically, so this was the next best thing.

My goal was to help volunteers that were in the field in Nepal communicate in English -> Nepali and back. Even though this was somewhat effective, there was still a communication gap because most people in remote parts of Nepal could not even read Nepali.

I looked around for solutions but couldn't find any Nepali Text To Speech solutions. The builder brain in me fired up and I decided to build a Nepali Text To Speech engine using some of the groundwork that was laid by Madan Puraskar Pustakalaya (Big Library in Nepal) which they had abandoned halfway.

I spent all night hacking away to build a web app that let the volunteers paste translated text and have it spoken. The result was https://nepalispeech.com/ and the first iteration of this was built in just 13-ish hours.

I hope the people that got affected by the earthquake are in a better situation now.


Hello and thank you!


Wow, weird to think about.

Nothing. I haven’t built anything with a significant impact. I’ve made things that made a significant impact on businesses, but in the scheme of things, nothing exciting.

The thing I made which generated the most revenue was easily the most harmful, and likely the most impactful. Unfortunately. It was an ad exchange that did extremely well. The owners went from random guys with a gross idea to multimillionaires in a couple years. They both spend their days buying up startups.

I should have done better by now. I feel like I need to make up for building that exchange. I was young and had no idea what I was getting into until it was too late.


Yeah, just a few weeks ago I saw an ad for an institutional real-estate investing platform that buys single-family homes. Not every software should be built.


I dunno, I haven't built anything impactful either, and I got in relatively early (early enough to have built Facebook/Google). I imagine that's true for lots of people.

I have a humongous list of failed stuff though, so much that when I look back I wonder why I couldn’t just stick with any given thing.


You’re not alone. I’ve been doing better in the last few years but when I was just starting out as a web developer set on building a successful SAAS, a lot of them were marketing/advertising related. One of them is a pop-up builder… Sorry!

It still brings in some revenue but I have been intentionally neglecting it for years now, as I personally hate those things with a vengeance. But still, I don’t pull the plug on it.


Co-created Dall-E Mini (now named Craiyon, and I am not involved anymore).

It was among the first text-to-image models created independently. And it was fully open source.

It also got covered by The New York Times, in the article by Cade Metz covering Dall-E 2.

Links:

- GitHub: https://github.com/borisdayma/dalle-mini

- Hugging Face Demo: https://huggingface.co/spaces/flax-community/dalle-mini

- NYT article: https://www.nytimes.com/2022/04/06/technology/openai-images-...

___

(I know this is not as impactful as others in this thread. But I did this less than 2 years after transitioning to tech from physics, and at the age of 22.)


Creating DALL-E Mini at 22 is a huge achievement, especially with less than two years of experience!


I merely worked on the team. It wasn’t my project, per se.


Thanks for making this, it has been very fun for me.


Very glad that you liked it.


I used it and loved it a lot, thank you!


Glad it could give you some fun.


A statistical technique I developed was incorporated into a number of award-winning spam filters, including SpamAssassin.[1]

I'm also apparently the original inventor of the tracking cookie, which had the implication that no one was able to patent it. It was presented in a patent of mine[2] that was about a collaborative filtering technique for recommending ads; I'd come up with the tracking cookie mechanism to support that technique. So, I didn't attempt to patent the tracking cookie separately; but because the patent was the first publication describing the method, no one else could patent it either. In 2021 a joint legal brief filed by Google and Twitter together, defending themselves against a patent troll, called it "Robinson's Cookie". My patent is owned by Google now. It contained a lot of details for giving users control of the data derived from tracking; that part was pretty much ignored by people implementing it.

[1] https://www.linuxjournal.com/article/6467

[2] https://patents.google.com/patent/US5918014A


The spam filter stuff used Paul Graham's word probabilities described in his seminal article A Plan For Spam. It changed those probabilities a bit, to better account for the number of emails a word appeared in. But in the main, my article was about a statistical method (NOT Bayesian) for combining those probabilities. The word probabilities were Bayesian, but the way of combining them used frequentist statistics. Even so, spam filters that used the technique were always referred to as Bayesian as if nothing frequentist was involved.
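
Condensed, the combining step looks roughly like this (a sketch of the chi-square approach described in [1], not any filter's actual code):

  # Fisher-style chi-square combining of per-token spam
  # probabilities. Near 1.0 reads as spam, near 0.0 as ham,
  # and near 0.5 as "not enough evidence either way".
  import math

  def chi2_sf(x, df):
      # Survival function of a chi-square with even df (closed form).
      m = x / 2.0
      term = math.exp(-m)
      total = term
      for i in range(1, df // 2):
          term *= m / i
          total += term
      return min(total, 1.0)

  def combined_indicator(token_probs):
      n = len(token_probs)
      s = chi2_sf(-2 * sum(math.log(p) for p in token_probs), 2 * n)
      h = chi2_sf(-2 * sum(math.log(1 - p) for p in token_probs), 2 * n)
      return (1 + s - h) / 2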


Legendary story, we've used your spam filter contributions - thank you!


https://www.inaturalist.org/

While I deserve no credit for its current success, it's been used by millions to:

* catalogue millions of plants and animals around the world

* tagged image data has become critical for computer vision training models

* map species range and impact of various natural changes to biodiversity, with data cited in scientific journals

* new species have been discovered through the app

previous HN thread - https://news.ycombinator.com/item?id=22442479


My wife (and sometimes I) use it. It's awesome!


:heart:


What was your involvement in iNaturalist?


Look for "A Little History": https://www.inaturalist.org/pages/about


I am a heavy user, so thank you.


:) Ty for being part of the community


iNaturalist is wonderful!


:heart:


I implemented various pieces of flight software for the Cargo Dragon space capsule, and to a lesser extent the Falcon 9 rocket. I got to be on a mission control shift for the C2 mission that first berthed with the International Space Station. Some of the software framework I wrote back then ended up getting used in Starlink despite having never worked on it directly, so I can say I have code orbiting the Earth on several thousand satellites.

I've spent the last 6-7 years making autonomous aircraft that deliver medical supplies in various African countries. Probably a hundred thousand deliveries or so have been for emergency blood transfusions, typically for women that hemorrhaged during labor. So that's got to be quite a few lives/families saved!


Created an account for this: would you be willing to share a bit more detail about the autonomous aircraft? It's an area I am extremely interested in. Perhaps a link? Many thanks!


Sure, head over to the company website flyzipline.com


The Windows Terminal. It was a long journey to get the console code fairly modernized and maintainable. Another long journey to build a whole new application that could be compatible with the old. And years now of iterating on that original prototype, out in the open.

It's not a perfect application, by any means. But the bar was _so_ low, that I can't help but think of how much we've helped users just over the last few years.


Nice! Thanks for working on it, the "command prompt" is absolutely miserable. I read all the blog posts explaining Windows and its different terminal/shell layers before the new one was added.

I hope it ships with Windows by default one day.



Ah cool! I guess I'll take advantage of that in a couple of years when we're all forced from 10 to the broken UI hell that is 11.


I absolutely love Windows Terminal! Thank you so much for everything that you have done! I love being able to assign shortcuts to things like an SSH console.


Pokemon GO.

It was such a surreal moment to finally leave the office after months of crunch time, walk out into the sunshine for lunch for the first time and see almost every person on the street playing the game.


You personally made Pokemon GO?


Bits of it. I wrote the code to figure out where on the planet all the pokestops and gyms should go, for example[1]. But there were five other backend engineers by the time we launched, plus a bunch of front end people, artists, etc.

[1] To be extra-clear, all code in the game was touched by more than one person, every one of them better engineers than I am.


Your experience needs to be documented for history. Seriously. The people, personalities, the development setups, the day to day creation - all that is of keen interest to millions.


I'll write it up.


If you do, let us know at hn@ycombinator.com and we'll put it in the second-chance pool (https://news.ycombinator.com/pool, explained at https://news.ycombinator.com/item?id=26998308), so it will get a random placement on HN's front page.


Will do! Thanks.


The OpenStreetMap community would be absolutely fascinated to learn what you were doing with OSM data, without a doubt.


I have so much to say here, especially when it comes to OSM vs Google Maps (given that the whole project was originally an offshoot of Google's Geo division) but I also have an NDA that I need to go read carefully. It's definitely not my intent to accidentally break confidentiality.

I do want to say how amazing OSM was. There are SO MANY weird laws in different countries, and OSM was a fantastic source of data in many of them. One example is South Korea - there were laws from decades ago that made it very difficult legally to have detailed maps of many parts of South Korea - the OSM maps there were far superior to anything else available.


Very cool! Feels like many organisations now have dozens of teams all working on parts of a badly made CRUD app, but you guys wrote something people actually want to use that scaled well enough for people all over the world to play.

That seems a million miles away from everyday agile and crud stuff...


As a longtime Ingress player, I'm quite surprised at this, considering this should have been Ingress code to begin with. Did PoGo not share much code with Ingress, just the dataset?


There was very little code shared on the backend. Ingress was App Engine talking to custom clients via JSON; PGO was GCE/GKE talking to Unity clients via protobufs. Almost everything on the backend was written specifically for PGO because the Ingress codebase just couldn't scale, at least not cost-efficiently, to the number of users.

Also, Ingress is all about controlling areas of the map, while PGO was mostly based on points of interest, so the architecture needed to be quite different. I'll go into more detail when I post the writeup.


That's interesting! My friends who were avid Ingress players repeated the story that at least the geospatial data was somehow reused directly (to generate points of interest automatically based on places that figured into Ingress gameplay). It would be interesting to hear to what extent that was a misconception.


There were quite a few sources of data, but yes, the Ingress POI data was used as one source.


The POI data is indeed reused from Ingress. For a very long time, the only way to get a new POI in Pokémon GO was to make a new portal in Ingress and have it sync over.


About 14 years ago - before I'd taken as much as an intro to CS class - I wrote some software that helped a bar keep track of who'd drunk what. They were the type of bar where, if you drank every beer they had available, you'd get a free mug. Prior to it being computerized, the staff used index cards in shoeboxes. Lots of the wait staff's time was lost fumbling through those boxes, unsticking them from each other (gross!), etc.

I've since gotten a degree and written software for a handful of companies.

When I think of how many people are actually _using_ my software, though? Fourteen years later, the mug club software is still live in a production environment, used every day by wait staff who turn over every few months. No doubt hundreds - potentially thousands (it got deployed at a few different bars) - of people have interacted directly with it. That code embarrasses me nowadays, but as far as impact goes: that's probably it.


It is amazing how much you can do with code when you only have minimal knowledge and the desire to make something work, before having ideas of "how it should be done" or "how to do it right".


I'm not in hi-tech or engineering. In fact, I'm a photographer who has made a living in this field since 1989. This forum is a wonderful resource for me, so I thought I would mention my particular contribution to the world as I contemplate retirement.

I've traveled North America photographing native bats. This was born from an obsession with documenting creatures that are not easily observed (this goes far beyond bats).

To accomplish the bat project, I built my own high-speed photo systems, designed specialty gear, and developed a process for capturing extremely detailed images of bats in flight. Others had done it before me, but never shared the technical process, so I had to build it myself. Then I collaborated with biologists and institutions around the country to learn about behavior and more. It was a hell of a journey.

I'm so proud of the project. This work is very hard to recreate these days because of pandemics amongst both bats (WNS) and humans (Covid). I think bats are among the most interesting creatures on the planet.

Working with all of these bat biologists, I learned of the holy grail of bats. It's a species that was once considered one of the rarest in North America. Up until the '90s, only a few specimens had ever been observed or documented.

But if you want to see images of the most spectacular bat in North America - the spotted bat - I am among the rare few who have ever seen one, much less photographed one.

Someday I'll have to tell the story of the Kentucky cave shrimp and how I traveled to the deepest bowels of Mammoth Cave with a crew of 20 - a combined group from the National Park Service and US Fish and Wildlife Service - to photograph these tiny and rare shrimp.

Don't get me started on my journey to photograph red tree voles (which live only at the tops of mature Douglas fir trees).

I'm bragging - yes. I never imagined I could make a six figure income from this work. I expected to be poor. I genuinely hope this work has lasting impact.

Coming from a family rooted in poverty, addiction, and early death - this path has been a surprise beyond description.


Congratulations on working at what you love.

I love bats. As you may know, two species of bat are the only mammals native to New Zealand, where I live. I hope to see one one day!


I spent 11 years working as a contractor for the U.S. State Department. During this time I:

- In 1996 I built and deployed a system to keep track of the removal of landmines in Bosnia, producing the maps they’d give out to schoolchildren. In 2015 I met someone who, as a child in Sarajevo, had known my work.

- I managed a project with over 30 team members to build a system to help former Soviet Union countries manage their import/export control policies.

- I helped create a system for generating some annual reports for Poland that were a requirement for them to join NATO.


Never worked for the federal government, but my first “real” full-time dev job was at a small state government agency, and the work I did there had very visible positive effects for people interacting with the agency. Pay was really low, though.


I designed and implemented the whole graphics system for World Cup '98 (working 100-hour weeks for months). Billions of people watched the result of my work in real time, and I earned absolutely nothing from it :) (There's a fun story to write about this: the tremendous amount of work, setting up the WAN connecting the SGI machines, building the remote control hardware, etc.)


As a football fan and obsessive, France 98 has a special place in my heart because it was my first World Cup as a child (I was born in 1990, so I wasn't fully aware during USA 94), so please please please write more about this.


Quite insane. If you ever write about it, I'll surely be reading.


World Cup '98 was the bomb! Your work is part of an amazing history.


Please follow up


OK maybe I'll write something and post it somewhere (HN dislikes Medium though).


A long time ago I worked for one of the big medical journal publishing firms. (No, the other one.) I was one of the lead software developers, nominally in charge of the web application that served all of our licensed content to medical professionals and librarians all over the world. I was senior enough at that point that I attended regular planning meetings with the CEO and her team.

We were working on a new product: electronic access to textbooks. I'd built the entire system that took the textbook XML we got from the content side, created the indexes used by our search engine, and made it possible to efficiently display in the web application any text fragment, from a full chapter down to a single sentence containing a search result.

The CEO called an emergency meeting: many of our library customers were government funded, and their funding required the library to receive a physical object in exchange for the licensing fee. They didn't want to have to store the physical textbooks, and we didn't want the overhead of sending them textbooks. So the team started talking about creating an entire new subdivision dedicated to the production, management, warehousing, and shipping of CD versions of the books, just so the customers could be given something physical.

I interjected: "If a CD is good enough, I can generate that using everything I've built already. I'm already converting the content to HTML for display in the app, so I can render the textbook out to a folder, one HTML page per chapter, with a table of contents and all of the images, and create an ISO image that the librarians can download using a link in the web application. Let them burn it themselves if they want a physical copy. They could also store the ISO locally so they still have that version if they let their license expire." That was a funding requirement as well.

So that's what we did. It took me a couple of days extra to implement that feature, and I saved the company a fortune compared to what they were considering doing.
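
For the curious, here's a minimal sketch of the idea in Python. The function names, file layout, and the genisoimage tool are illustrative assumptions, not what the original system actually used:

    # Illustrative sketch only: render chapters to HTML pages plus a TOC,
    # then pack the folder into a downloadable ISO.
    import shutil
    import subprocess
    from pathlib import Path

    def build_book_iso(chapters: dict[str, str], images_dir: Path, out_iso: Path) -> None:
        staging = Path("staging") / out_iso.stem
        staging.mkdir(parents=True, exist_ok=True)

        # One HTML page per chapter, collected into a table of contents.
        toc = []
        for i, (title, html) in enumerate(chapters.items(), start=1):
            page = f"chapter{i:02d}.html"
            (staging / page).write_text(html, encoding="utf-8")
            toc.append(f'<li><a href="{page}">{title}</a></li>')

        (staging / "index.html").write_text(
            "<html><body><h1>Contents</h1><ul>%s</ul></body></html>" % "".join(toc),
            encoding="utf-8",
        )

        # Ship the images alongside the pages.
        shutil.copytree(images_dir, staging / "images", dirs_exist_ok=True)

        # Rock Ridge (-R) and Joliet (-J) extensions so the ISO reads cleanly
        # on any OS; librarians download it and burn it themselves if needed.
        subprocess.run(
            ["genisoimage", "-R", "-J", "-o", str(out_iso), str(staging)],
            check=True,
        )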

I believe I got a $25 Starbucks card as a reward.


That's an annoying part of capitalism.

For example, the guy who invented the process for creating artificial diamonds for GE got a nice plaque and $1.


Probably bsdiff; a few hundred lines of code hacked together over a weekend have saved people over a hundred thousand years of waiting for software updates to download.

Next up is probably scrypt; it would rank higher if cryptocurrencies used it, but instead they use a nerfed scrypt which defeats the entire point of scrypt (see the parameter sketch below).

Third is probably FreeBSD/EC2. Of course I didn't do all the work for that, but I can certainly claim the status of technical project manager.

My day job, Tarsnap, comes in fourth.
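
To make the "nerfed" point concrete, here's a sketch using Python's hashlib.scrypt. scrypt's hardware resistance comes from its memory cost of roughly 128 * N * r bytes per hash, and the weakened parameters below cut that from ~16 MiB to ~128 KiB:

    # Sketch: the same primitive, honest vs. nerfed parameters.
    import hashlib

    password, salt = b"correct horse", b"0123456789abcdef"

    # Roughly the scrypt paper's interactive-login recommendation:
    # N=2**14, r=8, p=1, i.e. about 128 * N * r = 16 MiB per hash.
    strong = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

    # Litecoin-style parameters: N=1024, r=1, p=1, i.e. about 128 KiB per
    # hash, cheap enough to drop onto an ASIC, defeating the design goal.
    weak = hashlib.scrypt(password, salt=salt, n=1024, r=1, p=1, dklen=32)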


I'd also like to add that your "Cache Missing for Fun and Profit"[1] paper is an excellent read for people who are just getting into timing side channels. It's both so crazy (yet so believable) to me that it wasn't taken more seriously by academics when it was published. Quite frankly, the endless commentary from reviewers criticizing work for being "incremental" is a big part of why I'm walking away from academia.

I'm currently procrastinating my master's thesis on transient execution attacks, and just re-read it a few weeks ago while drafting my background section. So, thanks a ton for writing one of the most helpful introductory texts on timing side channels!

1: http://css.csail.mit.edu/6.858/2014/readings/ht-cache.pdf


You forgot my favorite cperciva project: spipe


I didn't forget it, and it's one of my favorites too -- but I don't know if it qualifies as the most impactful. People who use it love it, but it's not very widely used compared to, say, bsdiff.


I saved my company ~$18 million with a 100-line Perl script that I wrote in order to learn programming.

It parsed a text file containing Jeep parts that needed to be sequenced and printed barcode labels to Zebra printers (a rough sketch of the idea is below). One day a construction crew dug up all of our data lines, and we lost all comms to Chrysler and our data center.

We had to have a rotation of floor supervisors driving to Chrysler to copy/paste orders onto a floppy disk and bring it back to be processed. We kept the line running for about 30 hours, which basically saved our company because our contract with Chrysler stipulated that we would be charged $10,000 per minute if we stopped the line.
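
For the curious, here's roughly what such a script does, translated to Python (the real thing was Perl, and the file format, label layout, and printer addressing here are all guesses):

    # Rough sketch only. Assumes one part per line, like
    # "SEQ|PART_NO|COLOR|DESC"; the actual format differed.
    import socket

    def label_zpl(seq: str, part_no: str, color: str, desc: str) -> bytes:
        # ^XA/^XZ open and close a ZPL label; ^FO positions a field,
        # ^A0N sets the font, ^BC prints a Code 128 barcode.
        return (
            "^XA"
            f"^FO50,30^A0N,40,40^FD{seq} {color} {desc}^FS"
            f"^FO50,90^BCN,100,Y,N,N^FD{part_no}^FS"
            "^XZ"
        ).encode("ascii")

    def print_sequence_file(path: str, printer_host: str, port: int = 9100) -> None:
        # Zebra printers accept raw ZPL on TCP port 9100.
        with open(path) as f, socket.create_connection((printer_host, port)) as s:
            for line in f:
                if line.strip():
                    s.sendall(label_zpl(*line.strip().split("|")))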


I'm so curious.

So the process was:

* supervisor drives to Chrysler, pastes part orders into a text file, saves it to a floppy

* floppy returns to your company, you open it up and run the Perl script, which prints barcode labels

* ... then what?


The labels printed directly on the warehouse floor where parts were being sequenced. So, for example, the Rear Right Fender Flares printer would print barcode labels, and the person sequencing would pick up the next label in sequence, look at which color fender flare it specified, go pick that part, apply the label, and put it in a sequencing rack. When the rack was full, it got loaded on a truck bound for the plant.


So you kept the just-in-sequence supply chain of a Chrysler plant alive with a Perl script that could import their sequence orders from floppy disks into your system, just as if they had arrived via wire? Amazing.

https://en.m.wikipedia.org/wiki/Just_in_sequence


Correct! Although we typically called it "JIT" or "just in time". The racks got unloaded at the installation point, and the buffer was probably 10-15 minutes of parts. That was 22 years ago and I still have the script.


I work in video games and have done everything from writing gameplay code all the way up to online infrastructure. It's only been "impactful" culturally, rather than in the ways some of the other posts describe. My top highlights are:

https://www.nytimes.com/2019/07/28/sports/fortnite-world-cup...

I was a programmer working on Fortnite, and I ended up working on the on-site Fortnite events, doing everything from the custom cameras and broadcast-specific UI to hooking up the in-game events to the lights in the stadium. It was pretty cool!

https://youtube.com/watch?v=EWANLy9TjRc - I worked on this game (and the demo in this video) for a few years. I wrote much of the code for the destruction asset pipeline, lots of the gameplay code for how it interacted with the game, and a good chunk of the optimisation on the cloud physics side.


Maintained/released the GNU coreutils for the last 10 years https://github.com/coreutils/coreutils/commits?author=pixelb

Designed/Built/Deployed Meta's backend operating system for the last 7 years


> Meta's backend operating system for the last 7 years

Sounds interesting, I had no idea Facebook had their own OS - presumably a Debian derivative?


No, it's completely built from source and finely tuned to the vast, quite homogeneous hardware and deployment model. Though yes, it is Linux based.


If I can ask, do you mean a customized distribution with what I assume to be a heavily patched kernel?

Or is it just the kernel with a totally new/different userland from a normal “Linux” box?


Totally custom userland, though it does reuse common open-source components.


What was the reasoning for it, though? It just seems a huge time sink for questionable gain.


You're not considering the scale.

We have faster iteration than upstream distros. More flexible. More tuned. I won't give exact details, but 1% of CPU translates into extremely significant monetary savings, and there are at least 15% savings from static linking, PGO, LTO, a more appropriate `-march`, more appropriate CPU security-sharing considerations, ...

Billions of dollars per year, essentially in electricity and required-systems savings (considering the scale of serving 3 billion users a day).

Also, devs get access to the latest compilers, language levels, and libs, completely independent of the distros, which have a more general compat issue to contend with. Considering there are about 30k tech staff at Meta, this value also multiplies up.


Neat. Thanks.

