Ask HN: Most interesting tech you built for just yourself?
1901 points by l2silver on April 28, 2023 | 1796 comments
Maybe you've created your own AR program for wearables that shows the definition of a word when you highlight it IRL, or you've built a personal calendar app for your family to display on a monitor in the kitchen. Whatever it is, I'd love to hear it.



All: This thread has several pages of fabulous comments - to get at them, you need to click 'more' at the bottom of each page, or like this:

https://news.ycombinator.com/item?id=35729232&p=2

https://news.ycombinator.com/item?id=35729232&p=3

https://news.ycombinator.com/item?id=35729232&p=4

One of these years (maybe this year!) we won't need to paginate anymore and scrolling will be blissful again. In the meantime, sorry for the annoyance if you knew this; I just wanted to make sure everyone realizes how large and good this thread is.


My grandmother has dementia. About twice a day, she calls my parents every 5 minutes, forgetting that she just hung up. The calls are always the same: "You live there now. Yes you have money. We came to visit you yesterday." This can go on for an hour or so.

My parents are incredibly patient, but after a couple of these calls, they'll just leave the phone to ring. The soundtrack of the phone constantly ringing in the house, and the guilt associated with not picking up, is unbearable.

My brother and I built a system that re-routes her calls to a rotation of relatives, to spread the load. After a call with her, each person gets a 2 hour break (customizable). If no one is available to answer, or if everyone is on break, she gets a voicemail that my dad recorded that explains that we love her, that she lives there, all the usual stuff.

It's working beautifully.


I didn’t see any way to contact you directly in your profile, hence this comment. Can you send me an email (address in my profile)? I would like to learn more about your setup and the challenges you encountered.

I am actually a volunteer at a non-profit in Japan. The NPO provides a very similar service for elderly people, using volunteers. I have been looking into automating some of the call handling/routing, personalization, and increasing family participation.


Sent you an email :)


"The soundtrack of the phone constantly ringing in the house, and the guilt associated with not picking up, is unbearable."

Hit me right in the feels.

Thanks for sharing pigcat. Beautiful problem solving.


This is beautiful, I wonder if there is a way to make it available to more people. Not even as a business - I just imagine it would help a lot of families in similar situations.


Thanks! My brother and I are quite touched by the reaction in this thread. I will see what I can do about this - if not as a product, then by sharing a little more about what we have done and how it has worked so far.


Incredible use of technology! Caregiver overload is so common.


That'll do, pigcat. That'll do.


How did you do this?


We use a custom Twilio number, some rerouting logic (which is easily configured in Twilio), and an API endpoint to determine the next relative that will answer (a rough sketch of the handler logic is below). There is also a minimal frontend to configure things like who is on the roster, their break times, a place to upload a recording, and see call logs.

These are the steps we took:

1. Get a Twilio number

2. All incoming phone calls to the home phone are redirected to that Twilio number [1]

3. If incoming number == grandma, request the next relative to dial from an API endpoint. Redirect her call to them.

4. If no one is available, play the voicemail

5. If incoming number != grandma, redirect the call to dad's mobile number [2]

[1] A child comment by macNchz correctly pointed out "Selective Call Forwarding". This would have simplified the process and we could have skipped step 5, but our telephone provider did not offer it.

[2] Note that this is a bit of a compromise in the setup. The home phone never rings anymore and all non-grandma calls go to dad's cell. But they were happy to accept this.
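
If anyone wants the gist in code, the handler boils down to something like this (a minimal sketch with Flask and Twilio's Python helper library, not our exact code; the numbers, the next_available_relative() helper, and the voicemail URL are placeholders):

    from flask import Flask, request
    from twilio.twiml.voice_response import VoiceResponse

    app = Flask(__name__)

    GRANDMA_NUMBER = "+15550000001"   # placeholder
    DAD_MOBILE = "+15550000002"       # placeholder

    def next_available_relative():
        # Placeholder: the real roster/break-time logic lives behind our API endpoint.
        return None

    @app.route("/incoming-call", methods=["POST"])
    def incoming_call():
        resp = VoiceResponse()
        caller = request.values.get("From", "")
        if caller != GRANDMA_NUMBER:
            resp.dial(DAD_MOBILE)                 # step 5: everyone else goes to dad's cell
            return str(resp)
        relative = next_available_relative()      # step 3: who is next on the roster?
        if relative:
            resp.dial(relative)
        else:
            resp.play("https://example.com/voicemail-from-dad.mp3")   # step 4
        return str(resp)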


On Android you can use Tasker to automate this: http://tasker.wikidot.com/call-forwarding . I think iOS has a Shortcut that does the same.


In our case it was a landline, but yeah something like this would work if it was mobile!


There are many ways to do this. Asterisk and a rented VoIP line, a hosted PBX service you just rent and transfer your line to, a 'Web 2.0' variation of the same (like Twilio), or even a programmable desk phone.


Probably Twilio


How can you use Twilio to reroute a call to your phone number?


pigcat's brother here :)

In our case, my grandmother always calls my parents' home phone number. The phone provider only offers simple call forwarding, so we route all calls to the home phone over to a Twilio number. We have a very simple Twilio Studio flow that routes calls from my grandmother's phone number to the call handler script we created, and all other calls are routed to my dad's cell phone number: https://share.cleanshot.com/ywwhzJ8H

We are a little lucky in that we can forward calls from our landline to a cell phone. If your relative is calling you directly on your cell phone, your carrier would need to offer selective forwarding for this strategy to work.


Either route all calls from that caller ID to your Twilio number, port your own phone number to Twilio and route all other calls to your new number, or what I would do: replace my mom's phone with an ATA and set up a dialplan for how to route calls to the child's number.


Yep, that's pretty much spot on :D

I never knew about ATAs! I think that would have solved things a little more elegantly. Thanks!


Many phone providers actually offer what they call “Selective Call Forwarding”, which allows you to set up a limited number of basic routing rules to forward calls coming from specific numbers.


Wow how generous you and your brother are. Your family is so incredibly lucky to have you both.


Can't you get AI to answer the calls and have these conversations using your voice?


The prerecorded message seems sufficient for that use case. If I had a relative in that situation, I'd want her to be able to talk to a human - it might be an actual emergency or problem, but even if not...


Right. My brother and I discussed this out of curiosity and you're spot on. Prerecorded message is sufficient, and AI introduces too many wildcards.

But the biggest reason is that ethically, it somehow seems very wrong to trick my grandma that way.

Not sure why parent comment is getting downvoted though, it's certainly an interesting idea.


As my brother said, this is a really tricky area for us to explore for non-technical reasons. We go back and forth on this, as I do believe my grandmother's quality of life would improve dramatically if this could be done well. For now, we've decided not to explore this, but I think it makes more sense to build personalized AI assistants for people who do not yet have dementia but who are concerned they may in the future.


That's a fantastic idea.


But what did you use?


Hi unixhero! We forward all calls to our home number to a Twilio number. We then use Twilio Studio to forward calls from my grandmother to a web-based call handler that we created, and all other calls are forwarded to my dad's cell phone. I pasted an image of the Twilio flow in another comment if you are curious!


Twilio, ah cool. Seems like something I too could make use of :)


How did you route her calls?


You can use Asterisk (and a PSTN to SIP gateway): https://www.asterisk.org/


It's also possible to use Twilio with their GUI "Studio" to create this entire flow. I've used it as a call recording system for when I need to record outgoing calls and it's worked wonderfully (and was easy to set up).


This is true problem solving.

Thanks :)


Nice work.


If you love her so much, why don't you let her live by your side?


I guess hn is known for “why don’t you just” comments but this one really takes the cake.


I love how it's a guy named heroku. So unbelievably apathetic.... like Heroku is to its userbase.


Why not just have the voicemail be the default? Every 2 hours is absolutely ridiculous.


It's his grandmother, not some random person from the street. Not at all ridiculous, especially with larger family sizes.

I would have loved to have had that (or even thought of that) when my grandmother began developing signs of dementia. Fortunately, her symptoms weren't that bad before she ultimately passed away.


I have a rail line right under my apartment, so I built a small computer vision app running on a Raspberry Pi which records each train passing and tries to stitch an image of it.

It has a frontend at https://trains.jo-m.ch/.

Edit: it's currently raining and the rain drops are disturbing the images a bit.
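
The core idea of the stitching, as a very stripped-down Python/OpenCV sketch (the real app does a lot more: both travel directions, sub-pixel shifts, cleanup of bad frames; the column index here is arbitrary):

    import cv2
    import numpy as np

    def estimate_shift(prev_gray, curr_gray):
        # Horizontal displacement between two consecutive frames via phase correlation.
        (dx, dy), _ = cv2.phaseCorrelate(np.float32(prev_gray), np.float32(curr_gray))
        return int(round(dx))

    def stitch(frames, column=320):
        # frames: consecutive BGR images of a train passing the fixed camera.
        strips = []
        prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
        for frame in frames[1:]:
            curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            dx = estimate_shift(prev, curr)
            prev = curr
            if dx <= 0:
                continue   # no movement (or the other direction) between these frames
            strips.append(frame[:, column:column + dx])   # a strip as wide as the movement
        return np.hstack(strips) if strips else None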


Have you considered getting a line scan camera for sharper and higher resolution images? I took some train scans with one: https://daniel.lawrence.lu/photos/

Incidentally I also built some tech for it: https://github.com/dllu/nectar but I need to update the readme...


Thanks for sharing, those photographs are very clear and sharp (especially this one: https://pics.dllu.net/file/dllu-pics/boston-pcc.jpg); it seems to tickle my brain.


I have three of those actually:

https://daniel.lawrence.lu/photos/pcc

Technically, the photo could be twice the resolution, since the length of the line scan sensor is 4096. It consists of two lines, RGRGRG and GBGBGB. By interpolating the red and blue channels, it would be possible to get images 4096 pixels tall. The challenge is that the two green channels apparently have quite different sensitivity and also each pixel has some variation in sensitivity, which also seems to drift with temperature and settings, so it's quite annoying to calibrate everything properly haha.


This reminds me of Wes Anderson movies for some reason.


I'm a big fan of Wes Anderson's aesthetics and would love to shoot that funicular train from Grand Budapest Hotel (which actually exists --- the Buda Hill funicular) using my line scan camera.


His style is to shoot his subjects straight-on. Most other movies have the camera at an angle.


Wow, the pictures look amazing! Yes, the look of line scan images was an inspiration for this project. But of course, I also tried to keep BOM costs down and so ended up with an RPi 4 + RPi Camera.


The RPi HQ camera is a nice step up from the regular RPi camera while not being too expensive. Incidentally, I also have a project using that [0] but unfortunately no trains where I live.

[0] https://daniel.lawrence.lu/blog/y2022m01d27/


Have you tried the opposite direction? Sitting on the train with the line feed and taking a picture of the outside? Like, say, a panorama view of the entire run-length of the line, distorted in proportion to the train's turns and accelerations.


I love how the line scan camera’s horizontal background makes it look like the trains are moving impossibly fast. Not only are the images sharp & high res, it has a great aesthetic and implies you were tracking an action shot.


I remember seeing your photographs on Wikimedia Commons and wondering how you did them - now I know! I always assumed that you just used a very quick shutter with an f-stop of zero :)


I have a huge backlog of photos that I need to contribute to Wikimedia Commons! I'll get around to doing it eventually, hopefully before 2045.


Do it sooner than 2038 and you get the privilege of 32-bit timestamps on the metadata :)


I know he told us how already, but that would have left the background sharp, rather than always the same.


Woah the resolution! You can see the earpods on a person behind the tinted windows of Shinkansen.


How do you get the x scaling right? You have to measure the speed of the train somehow?


When the train is moving at a constant speed, you can just scale the image manually to make it just right. If it's moving at a non-constant speed, you can apply a spline or similar to remove the distortion.


Your blog is a gem, thanks for sharing!


Tip: some more interesting ones (including failed ones) show up if you filter for shortest.

https://trains.jo-m.ch/#/trains/350

https://trains.jo-m.ch/#/trains/3224

https://trains.jo-m.ch/#/trains/4045

Etc


I think the first one is a firefighter train. There is one (the same?) that lives near me.


This is such a cool project! I live right next to a busy road and for a long time have wanted to do something like this that would count the vehicles passing. I've always been curious how many cars pass on a given day, and I feel like the hardest part nowadays would be getting the right camera angle so that if cars are occupying all 3 lanes they aren't counted incorrectly. From there I just need to detect cars as they pass and count them.

It's really cool to see it used like this! The resulting images are really neat as well!


There was a post yesterday about counting traffic on a pi, you might want to check it out: https://nathanrooy.github.io/posts/2019-02-06/raspberry-pi-d...


Thank you for this! It's great!


Given the type of trains that are passing (it seems no IC/IR), along with their precise timing and direction, I'm sure it is easy to figure out where exactly you are living.


Especially in Switzerland where the trains actually go on time :P But anyway does it really matter? It'll still be hard to identify the actual apartment.

Most online webcams are easier to identify


Given an approximate location, even I could do that. You just look at the camera angle.


But the camera image here is heavily processed. Getting an angle out of that looks difficult.


Wonder if we can get a nice photo of an HN reader?


Please don't actually do this without permission. It's bloody terrifying when 4chan does it.


Yes.


No interest in trains, but your website is great - simple, visual and effective.


I wonder if there's open data or an open API for the schedule or location information. That way, you could include information on which train is which.


Yes, the APIs are there, with minute accurate real time data. Would just have to do it ;)
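
One option is the community-run transport.opendata.ch API, which exposes a real-time station board; a rough sketch (station name and field handling here are illustrative):

    import requests

    r = requests.get(
        "http://transport.opendata.ch/v1/stationboard",
        params={"station": "Zürich HB", "limit": 5},
        timeout=10,
    )
    for entry in r.json().get("stationboard", []):
        print(entry.get("category", ""), entry.get("number", ""),
              "->", entry.get("to", ""), "at", entry["stop"]["departure"])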


Made me think of Kartrak (an early optical barcode-like system for tracking rolling stock): https://www.youtube.com/watch?v=5K8UpMNYIPo


This is close to what I've always wanted to build: a camera watching the road next to me that records the speed of the vehicles traveling by. I should have everything needed from a simple camera setup, but I've not bothered actually doing it.

Since you have speed, I should dig into this.


I wanted that with noise levels. I'm so very tired of hearing illegally modified exhausts. It seems like an I2S mic would give calibrated levels.


Same here, I have a Pi 3 but I want to have this outside on the balcony; the questions that always stop me are how to power it and what camera I need.


My plan was to stick the Pi inside, and power both it and the camera with Power over Ethernet (external-rated PoE cameras are a dime-a-dozen on Alibaba and friends).

I even got so far as to get it working with Zoneminder to dump out the clips that had motion, but didn't get further.



I actually have done something similar to this, since I had a pretty clear view of the road in front and had my old Android phone lying around.

Used YOLOv5 to detect vehicles and DeepSORT to track them, and also got a rough estimate of their speed as they passed.

Here's the two-part blog post I wrote about it:

https://dharisd.github.io/posts/vehicle-monitor/

https://dharisd.github.io/posts/vehicle-monitor-part-two/
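
The speed part is conceptually tiny. Here's a stripped-down sketch (the real pipeline used DeepSORT for tracking; this just shows YOLOv5 detection plus the speed arithmetic, and the calibration constants are made up):

    import torch

    model = torch.hub.load("ultralytics/yolov5", "yolov5s")   # pretrained COCO model
    PIXELS_PER_METER = 40.0   # calibrate against something of known length in the scene
    FPS = 30.0

    def car_centroids(frame):
        # frame: image path or RGB numpy array; returns centre x of detected cars/trucks.
        det = model(frame).xyxy[0]                     # columns: x1, y1, x2, y2, conf, class
        mask = (det[:, 5] == 2) | (det[:, 5] == 7)     # COCO classes: 2 = car, 7 = truck
        cars = det[mask]
        return ((cars[:, 0] + cars[:, 2]) / 2).tolist()

    def rough_speed_kmh(frame_a, frame_b):
        # Naive: assume the first detection in both frames is the same vehicle.
        ca, cb = car_centroids(frame_a), car_centroids(frame_b)
        if not ca or not cb:
            return None
        dx_pixels = abs(cb[0] - ca[0])
        return dx_pixels / PIXELS_PER_METER * FPS * 3.6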


Could you show average speed vs car manufacturer? Or vs car type (compact, European luxury, minivan, truck). I've always wondered if this is correlated.


Would be interesting; perhaps the red ones ARE faster.


The short ones are fastest.


I live on an airport - I'd really like to do this.


https://skybot.cam/ (see also https://github.com/IQTLabs/SkyScan) might be of interest to you?


Do you mean by an airport?

Or do you literally live in or on an airport?


Awesome. Congratulations from a fellow Swiss (and panorama photo dabbler).


And yet again I forgot it’s not the Chinese TLD.

By the way, I have a quick expansion for most TLDs, and for the Swiss .ch, "cheese" sounds rather more apt and easier than the real one in my head :)


This is cool. How do you calculate the speed of the train?

I'm assuming you are measuring how long a certain feature of the train takes to get from one point of the frame to the other. Similar to how police catch people speeding by measuring how long road markings take to pass in a given frame.


Sounds interesting, I'd love to know how you do it. Is the speed calculated based on the noise of the wheels going over a track joint? Then you can work out the length/speed based on the time it takes, etc. Are the train types/images random or calculated somehow?


There is a parameter which tells the program how many pixels there are per meter. From this you can compute the length after stitching. Using framerate, you can compute the speed in the same way.
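
In code it's basically just this (illustrative numbers):

    PIXELS_PER_METER = 50.0   # the calibration parameter (value here is made up)
    FPS = 30.0                # camera framerate

    def train_length_m(stitched_width_px):
        return stitched_width_px / PIXELS_PER_METER

    def train_speed_kmh(mean_shift_px_per_frame):
        return mean_shift_px_per_frame / PIXELS_PER_METER * FPS * 3.6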


I love this so much. Amazing use of creativity and tech chops. A++ would trainspot again.


Really cool. They look like model trains! :-)


Yeah! I've never seen trains so clean and modern looking in my life. They look like they came out of a futuristic toy set.


They're swiss trains, I guess we have enough wealth to make sure that our trains are clean. The interior is also almost always clean, except early morning on weekends (drunk people).

From time to time I see a train with graffiti on it, but usually they remove such things very fast.


That's Europe for you. Usable train infra is actually a thing.


It somewhat works but let's not exaggerate how well it works.


Maybe West Europe. Definitely not here in the east.


There are visitors from the east sometimes ;)

https://trains.jo-m.ch/#/trains/2483


Wow that's very cool. The resulting super flat images of the trains are really interesting -- like taking a photo from an immense distance.


When you say "right under my apartment", where exactly do you mean? Because I also have a train line going underground very near my apartment but it's not directly under. Could I capture such images? And I'm on the 4th floor.


I got confused at first too. What he means is "somewhere outside on the ground level while my apt is not on the ground level, there are trains passing by which I can see from my apt". You need line of sight.

This is not mysterious tech deriving images from sound traveling through the floor. You will be out of luck with your underground subway.


A fun fact is that there is a monorail station built in the middle of a residential building, so it might also be literal.

https://en.m.wikipedia.org/wiki/Liziba_station


If you have the schedule and the sound I'm sure you could make a cam with translucent ground. You should be able to figure out where it is, which model and how long it is. Who knows, maybe the orchestra of brake sounds (it has many) is unique enough to spot which one it is exactly.


Nice job! I would be really happy if I ever finished my own hobby projects this well.


Sharing this with SBB

I hope they notice (also makes me want to guess the location). I am in Zurich and I hope I don't find this spot


Reminds me of the 90's Lego computer game "Lego Loco"


I'm glad there's at least one other person in the world who remembers that game.


Some cool tech there too - client side sqlite db over wasm, neat! :)


Very neat. What are those power cars (Triebkopf)? I thought SBB was only using proper locomotives (cabs at both ends) and EMUs (KISS, Twindexx, …).



This is really cool! Thanks for creating and sharing!


Taking trainspotting to a new level, congratulations


This is so cool!


Curious to know how you manage to concentrate on work during waking hours, and how you sleep peacefully?


There's a lot of trains in Switzerland so there's a lot of apartments by tracks. For the most part, the apartments are just built well.

Plus, the trains and tracks are very well maintained, so they create a lot less noise than you may be used to.


Very cool, how does the stitching work?


Wow this is cool as hell!!


I think this might be my favorite. Wonderful idea and execution!


How accurate is the speed and length measurement?


Very nice! Pleasantly surprised to see the SBB logo <3


A hybrid between area and line scan - block scan camera?


What’s up with the duplicated cars at the top?


The camera is pointing at the car. The train is moving past the car. The images of the whole train are made by stitching together lots of photos, all containing the bit of the train in front of the car as it moves past it.


From what I can see, they're not actually duplicated; I would suggest taking a closer look at the windows. But I do agree that it's quite hard to see the difference.

The trains look very clean from the outside. I do wonder how loud it is, to live so near the tracks.


They are also very clean from the inside :).

I also live right next to a train line in CH (that has exactly the same kind of rolling stock passing by as the ones captured by jo-m). These are modern commuter trains (no cargo and long distance trains), and are a lot less loud than you'd expect. A somewhat busy street nearby would be an order of magnitude more annoying.


This is really impressive. Very nice work


Cool project - thanks for sharing!


This is incredible. Congrats.


This is awesome. Really nice!


These pics look great. Like some big model train set catalog.


This is really sick!


This is so cool!!


This is amazing


Amazing photos


that is very neat, thanks for sharing!


So RAD !


That's amazing. Very cool.


wow that is so cool!


this is so cool :D


Cool! :D


I have a "TV channel" app running on a Raspberry Pi serving up local video content to a schedule I create.

The Pi has a 5TB hard drive attached with perhaps 1000 videos or so. The app has a schedule and plays the videos according to the schedule. It starts up in the morning, plays tele-courses, moves on to old TV shows, an afternoon movie, after-school shows begin around 3:00 PM, a comedy show around dinner time, an evening movie, some late-night content, then the Indian-head test pattern and "We Will Resume Broadcasting Tomorrow Morning...."

It fills dead airtime by choosing randomly among (literally) thousands of YouTube short clips I have on the drive — or showing a title card indicating when the next show begins.

Partly it's a fantasy — to have my own "channel" with my own scheduled content — my fantasy station.

Partly it serves to put on content I would otherwise not be inclined to pull up, double click and watch. It adds the serendipitous element to TV watching that I miss from before streaming. The movie "Charly" (1968) just came on last night and I am sure I have not seen it since I was a teenager — had to stop what I was doing and watch a few scenes I recall vividly.

Today's lineup here: https://engineersneedart.com/UHF/

(Since the schedule is in JSON format, it was easy enough to make a web front end to display today's schedule.)
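
The scheduler itself is conceptually simple. A very condensed sketch of the loop (not the real app, which also does the title cards and sign-off; paths and the schedule format here are illustrative):

    import json, random, subprocess, time
    from datetime import datetime
    from pathlib import Path

    SCHEDULE = json.load(open("schedule.json"))          # list of {"start": "HH:MM", "path": ...}, sorted by start
    FILLERS = list(Path("/mnt/tv/clips").glob("*.mp4"))  # the pile of short clips for dead air

    def current_show(now_hhmm):
        due = [s for s in SCHEDULE if s["start"] <= now_hhmm]
        return due[-1] if due else None

    played = set()   # a real version would reset this at midnight
    while True:
        now = datetime.now().strftime("%H:%M")
        show = current_show(now)
        if show and show["start"] not in played:
            played.add(show["start"])
            subprocess.run(["omxplayer", show["path"]])                 # blocks until the show ends
        else:
            subprocess.run(["omxplayer", str(random.choice(FILLERS))])  # fill dead air with a random clip
        time.sleep(1)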


Ha! The Final Sacrifice is on tonight. That's one of the very best MST3K episodes.

I would love to do something like this for my kids. They're constantly begging to watch Youtube, which I limit pretty heavily. Something like this could allow me to stick some pre-approved videos into a queue, and maybe even make an allowance for a half-hour of some of the ... dumber stuff that they like at a certain time of the day. I could also slip in some Kurzgesagt, Mark Rober, content that they may not otherwise be that interested in to surreptitiously educate them ;)


For those interested in doing something similar there's a Plex add-on for making custom TV channels:

https://github.com/vexorian/dizquetv

Personally I want almost this. I want to rotate the TV shows my kids watch in the morning but I don't want to start part way through a show (the one part of the old analogue experience that I don't miss at all). Difficult to square that circle.


Your comment led me to learn about Plex Plugins even though I've been using Plex for literally a decade!

Unfortunately they're removing support for all Plugins over time and have already eliminated ones that play content.

https://support.plex.tv/articles/categories/online-media-sou...


God. I wish they would stop shagging about with the content stuff (I don’t want their content, ever).

I would like a more logical arrangement of settings, remote setting of user resolutions (eg, please try native!).


Try out Jellyfin! It's an open source Plex replacement that doesn't mess with its users.

I switched a year ago after some dumb monetization change at Plex annoyed me enough and it's just been so much better.

Imagine a product just like Plex, but without all the shit Plex tries to pull all the time!


Do you ever need to access your library remotely or from iOS devices?

That's the one killer app for Plex for me. I can get to my library when I'm on the road (music more than movies or TV).

Does Jellyfin do anything like this? Any other caveats if you've used both?


I'm using a seedbox and I can access it from anywhere, just fine. I'll be honest, I'm not sure about the implementation details, but I'm pretty sure it's done via a reverse proxy like so: https://jellyfin.org/docs/general/networking/caddy/


Thanks for sharing! I'm reading the docs but was curious to also hear some first-hand experiences.


Wow. No idea this existed. I was about to set up a TV tuner with my Plex and pay $4.99 for the ability to broadcast live TV over Plex but see that JellyFin has this feature for free.

THANK YOU!


I like the idea of Jellyfin, but I had a lot of performance issues with Jellyfin and ended up having to switch back to Plex.


What kind of performance issues? I haven't experienced anything like that personally


HDMI seems to be two-way; my PS5 turns on automatically when I turn my TV on.

So you should be able to do something with that.


The protocol you're looking for is HDMI-CEC! Not a ton of good documentation out there, but hopefully this helps send you down a good path.


Oh, it's definitely possible. Software like dizquetv would need to know when a new connection is made. But to add such a feature would require a lot of familiarization with their codebase and I don't currently have the time.

One day...


All this is missing is an RF modulator and a very-low-power transmitter, just enough to reach throughout the house...


Do houses not come with coax anymore? I broadcast to my house on an injected channel using an HDMI-to-RF modulator[0]

[0] https://www.provideoinstruments.com/hdmi-to-rf-modulator-con...


This reminds me of the channels gamers get in Ready Player One. The main character used his channel to broadcast his favorite T.V. shows that other people could tune in and watch. This is really cool!


If it were public, it would sadly fall afoul of copyright laws. Which is an absolute shame. Netflix and friends should definitely find a way to make this exist.


I did this same sort of thing. My impetus was that I have tons of shows and movies to watch but I 1) don't necessarily want to binge every episode back to back and 2) my wide selection leads to choice paralysis. I mostly want some background noise rather than something I'm super engaged in.

I wrote a script to catalog all my shows/movies then another that reads a schedule and generates a daily playlist. My schedule has daily episodes of some shows and then weekly showings of others. I even put some network block bumpers between some shows and "upcoming schedule" clips.

The output of the scheduling script is just an m3u playlist. A cron job loads the day's playlist at midnight and it plays continuously during the day. There's no controls to pause or anything, if I miss something I miss it (by design). All the video content is stored on a 5TB drive plugged into the machine.

To complete the old school analog nature of the project I picked up a low power Hlly VHF video transmitter. I've got a small CRT TV in my office that I use during the day and I can pick up the signal on the TV in the living room. The project started on an RPi with VLC but it struggled on some videos I'd ripped from Bluray so I replaced it with a little fanless AMD box with an HDMI-RCA adapter. It sits in the garage and I can pick up the signal anywhere in the house.

The best part is apart from the setup it's proven to be pretty reliable. My next step is to make a schedule output like what you linked and maybe a web based UI to let me "change channels". For right now it does what I want with no real fuss and I always have something on that I like.
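
If it helps anyone, the playlist generator boils down to something like this (heavily simplified from what I actually run; the file layout and schedule format here are made up):

    import datetime, json
    from pathlib import Path

    schedule = json.load(open("schedule.json"))    # e.g. {"09:00": "MASH", "09:30": "Cheers", ...}
    catalogue = json.load(open("catalogue.json"))  # {"MASH": ["s01e01.mkv", "s01e02.mkv", ...], ...}
    progress = json.load(open("progress.json"))    # next episode index per show

    today = datetime.date.today().isoformat()
    lines = ["#EXTM3U"]
    for start, show in sorted(schedule.items()):
        episodes = catalogue[show]
        idx = progress.get(show, 0) % len(episodes)
        progress[show] = idx + 1                   # advance so tomorrow gets the next episode
        lines.append(f"#EXTINF:-1,{show} ({start})")
        lines.append(str(Path("/mnt/tv") / show / episodes[idx]))

    Path(f"playlist-{today}.m3u").write_text("\n".join(lines) + "\n")
    json.dump(progress, open("progress.json", "w"))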


That is super cool!

I’ve wanted to do the m3u playlist thing for a long time, so I could create a HLS stream for each “channel”. Then family members could watch from any device.


What a great idea! Are you inclined to make a guide? If so, my old-school wife and I would be grateful.

Otherwise, I will enjoy the fun of figuring it out for myself some day.


I'll open-source it when I get the embarrassing bugs worked out.


I encourage you not to be embarrassed and to simply opensource it. To err is human; anyone giving you grief because of bugs doesn't deserve the effort you've put in. And opening it now could actually bring assistance in getting those bugs fixed, while simultaneously benefitting everyone who wants to do something similar but isn't sure where to start


I won't speak for anyone else, but sometimes "bugs" is more about process than code. I have a similar project to the GP's and am not currently interested in open sourcing the project because there are a lot of bespoke elements and manual setup steps. I don't want to have to make a README describing all the process steps that make my code actually useful.

For me, on my hardware, on my network, I've got a process that works. It's a non-zero amount of effort to generalize the description of that process.


I have been thinking about doing exactly this for Saturday morning cartoons to stream anime to my PVM once I can figure out how to stream 480i from a modern device to RGB.

Would you consider sharing how you set it up? I’d love to do something similar!


I wrote the app in Python for the Raspberry Pi. For video playback I am using the (now deprecated) omxplayer.

I tried using VLC instead for video playback (I think the more accepted way to play video from Python now) but when VLC completes showing a video there is a visible flash that I cannot figure out how to get rid of.

I should point out though that it doesn't "stream" — you'll have to find some other solution for that. The Pi is a dedicated "player" hooked to a dedicated TV that is always on, always showing what the Pi has to offer up.


You might look at mpv instead of VLC. I had the same visible flash problem with VLC and mplayer but not with mpv. The other benefit of mpv I just (as in two days ago) found was I can use a loudness normalization audio filter to keep some shows from having blaring audio.

On my system I'm running mpv on top of OpenBox with compton for the compositor. It's been much smoother all around than VLC or mplayer on the same hardware (an AMD mini PC now replacing an RPi I had been using).
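
For reference, the loudness bit is just an mpv audio filter flag; launched from a script it looks something like this (a sketch; the path and the extra flags are just examples):

    import subprocess

    subprocess.run([
        "mpv",
        "--fs",            # fullscreen
        "--af=loudnorm",   # ffmpeg's EBU R128 loudness normalization filter
        "--really-quiet",
        "/mnt/tv/shows/example-episode.mkv",
    ])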


Thank you! I will do that.


VLC has a setting to pause videos when they reach the end, instead of stopping them. Depending on how your system is set up, this might prevent the flashing.


Very cool. This is exactly what I miss about old time TV; being able to catch a show by chance. I find it interesting that most of the time there's so much choice that I can't get the energy to pick one and stick with it. For a while there, I had 4 streaming services and never watched any of them. I just wasted my money.


I did the same thing with my Chromecast, I made it play a random episode from my library, one after the other, so there was always something I liked on.


I've wanted to do this for quite some time! Do you serve it over your local network or is the Pi directly connected to the television?


That’s great! Like the idea of bringing serendipitous timing back. I see ‘Sounder’ is coming up, I recently got that soundtrack on vinyl!


Engineer Sneed Art?


Look who is out of the loop on Sneed Art


Formerly Chuck Art


Engineers nèè Dart


Engineers Need Art ^^


Engineers Need Art


Blind developer here; I often write tools for myself to perform some task that is not well supported by my screenreader. For example:

* I wrote an add-on that allows me to read HN comments in a structured way. A typical screenreader would present the page in a linear manner, so you'd have to read all replies in order, which is quite tedious in popular posts. My add-on parses the page and identifies the level of each comment, and then I can navigate to the previous/next comment at any level. So I can quickly check top-level comments and then read replies only if I'm interested.

* Another add-on makes Jupyter edit boxes work with my screenreader. Jupyter was required at my company, so I either had to write that add-on or else. The way it works is that it sends Control+A Control+C keystrokes to the browser to retrieve contents and then presents them to me in an accessible window for editing. When I'm done, it Control+A Control+V's the new content back into the edit box.

* BlindCompass - an iOS app that I wrote for myself to navigate on the streets. One of the problems for blind people is that it is easy to lose the sense of heading, e.g. where north is vs. south. So BlindCompass reads my heading and presents it as a two-pitch sound, which allows me to deduce my rough direction. It's also easy to figure out the right direction and just maintain it, so with BlindCompass I can cross large open spaces easily.


BlindCompass sounds (pun not intended) brilliant! Did you have inspiration for this or was it an original idea? Not any less impressive either way, just curious as someone who’s not at all familiar with this space.


I got the idea when I was learning to cross a wide street with a white cane - three lanes in each direction - and it proved to be a challenge because I would veer left or right and frequently get confused and lost. Then I thought a compass would be helpful, but a quick survey of compass apps on iOS showed that they are either visual, or show your heading as a number that can be read by VoiceOver, which is still not very practical. So I thought that I needed to encode heading as something that my brain could easily decipher while crossing the street. I have prior musical training, so that's why I decided to encode heading as a musical interval. This allows my app to communicate heading with about 10 degrees of precision, and in practice this is good enough to walk in a straight line long enough to cross the street.
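
The mapping itself is tiny. Roughly (a sketch, not the app's actual code; the real interval scheme is chosen a bit more musically):

    ROOT_HZ = 220.0           # fixed reference tone (A3); the choice is arbitrary
    DEGREES_PER_STEP = 10.0   # ~10 degree resolution, 36 steps around the circle

    def heading_to_freqs(heading_deg):
        """Map a compass heading to (root, second) frequencies in Hz."""
        step = round((heading_deg % 360.0) / DEGREES_PER_STEP)  # 0..36
        second_hz = ROOT_HZ * (2.0 ** (step / 12.0))            # one semitone per step
        return ROOT_HZ, second_hz

    print(heading_to_freqs(0))    # (220.0, 220.0)  - due north is a unison
    print(heading_to_freqs(90))   # (220.0, ~370.0) - due east is a major sixth up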


I saw someone's cane get hit by a car one time, causing him to lose orientation and make an almost 90 degree turn in the middle of the road. If the cane is pointed towards or away from traffic, people don't even stop anymore.

This was actually long ago but it stuck with me. Just now I realized what a perfect analogy for life it is. You move towards something, something interrupts the journey and then you just continue, thinking you are moving in the same direction.


Reminds me of season 5 of person of interest where the super intelligent AI known as "the machine" is giving relative directions to a character using ascending tones for right and descending for left, or something like that. The other character preferred directions in positions on an analog clock.


Haha, this was my immediate thought as well! That was awesome. Root is a badass.

As the sibling says this was the season 2 finale. There were other similar instances.

S02E22 "God Mode": https://www.youtube.com/watch?v=cHIo96yBf70 (3m54s)


That was actually season 2! The final episode, I just watched it last week on a rewatch of the show. A lot of it feels pretty prescient right now, although not quite as intense as when the Snowden leaks happened and confirmed all of the government illegality of the show but none of the AI lol.


Reminded me of this: https://sensebridge.net/projects/northpaw/. Saw this linked on HN somewhere back in 2012 or so.


Reminds me of the train puzzle in Myst!


I'm not blind but I wrote an EPUB to Text-To-Speech reader using Coqui (a really good AI TTS project). There are books I wanted to listen to while doing other things, and I couldn't find audio-book versions of them, so this worked out perfectly. It could be that I did not do enough searching, but I was surprised I didn't see anything out there that already worked this way.

https://github.com/aedocw/epub2tts
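
The bare-bones idea fits in a few lines (the linked repo does much more: chapter handling, sentence splitting, resuming, etc.; the model name and paths here are just examples):

    import ebooklib
    from ebooklib import epub
    from bs4 import BeautifulSoup
    from TTS.api import TTS

    book = epub.read_epub("book.epub")
    chapters = [
        BeautifulSoup(item.get_content(), "html.parser").get_text(" ", strip=True)
        for item in book.get_items_of_type(ebooklib.ITEM_DOCUMENT)
    ]

    tts = TTS(model_name="tts_models/en/ljspeech/tacotron2-DDC")  # any Coqui model works
    for i, text in enumerate(chapters):
        if text.strip():
            tts.tts_to_file(text=text, file_path=f"chapter_{i:03d}.wav")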


BlindCompass is neat! My child told me that she had learned about the existence of compass implants and wants to get one. Would that be useful from your perspective?


Very interesting idea. I think it depends on precision with which that implant can communicate heading. If precision is 10 degrees or less, this can actually be extremely useful to visually impaired people.


Implants? Sounds too hardcore. But I remember some people talking about compass anklets; you put one on your leg and it gives a signal (vibration?) on the north side. They said it can greatly improve your orientation on a hike, even if you do not pay conscious attention to the anklet.

(This happened years ago, I do not remember more details.)


Is this it? https://apps.apple.com/ca/app/blind-compass/id1546647415

I’m not blind but I have terrible time with directions and navigation. I’m gonna try it next time I have to stick to a general heading.


This is one of the coolest things I've ever read. If you don't mind me asking, I'm really interested in how you read, write and edit code. What tools do you use? What's your typical workflow like?


Those are pretty cool screen reader add-ons. Do you use JAWS, NVDA, or something else?


I use NVDA. Here is that add-on if you're interested: https://github.com/mltony/nvda-browser-nav/


I built myself an automated hydroponic grow tent.

It measures and corrects pH, electrical conductivity, oxidation reduction potential, temperature of the air and water, water level, and humidity. It also automates pumps, lights, and fans (I know people normally advise against this). None of it is particularly sophisticated, but I’m really proud of it.

I initially used a deep water culture and later moved on to the nutrient film technique. It produces a lot of greens and herbs — way more than I ever expected — and it’s remarkably hands off. I recently left it to do its thing for almost 3 months before I had to intervene, and the problem wasn’t the water, nutrients, or the system failing explicitly. The plants just got too big for their channels and as they became stressed, they developed some pest issues. It was such a cool and empowering experience to see real world automation Just Work.

The whole thing is powered by an Arduino Nano RP2040 Connect. It’s a great little controller.

I’m currently designing my first PCB to consolidate the system onto a single board so my friends can easily build their own. It’s not extremely cheap, but it’s not too expensive either and you get a tremendous amount of food from it. It’s such a fun hobby.


I’m very interested, is it possible for you to open-source it? Also, what are its absolute dependencies? Does it depend on daylight? Fresh air from outside? Stored chemicals? Is water/air recycled? What is the reason behind you making this? I’m preparing for Collapse and want to do such a thing soon. If you can open-source it, it would be very cool and helpful.


The main resource that you need in steady state is always electricity. For light, heat, pumps. And a lot of it, especially in winter time. The rest is mostly closed loop.

If you are preparing for Collapse, ensure you have multiple independent sources of electricity available. Solar, hydro, wind. If you are in a city you are better off with a storage room full of canned food, since your hydroponic plants won't give you much food after the power goes out.


Thanks. I don't want to be anywhere near any population centre, the biggest threat by far is masses of hungry people imo. I'll build an open-source food production system with only dependency being sunlight later this year inshaAllah, everything else will be closed loop. I don't think any amount of canned food will cut it either, this is Collapse of the entire civilization we are talking about. Maybe 10 years of canned food will let me and my family ride out the initial population decline, but there's always the possibility that armed groups will take it, not to mention I can't not share it with those in need (read: everybody) as a muslim.


I can’t not share with others, either. To me a collapse means bringing everything I’ve got to the table for my family and community — no hoarding, no hiding. I really believe these skills (hydroponics, basic automation, growing food) will matter if something happens.

People who can help and provide actually stepping up to support less able people is the best thing we can use to prevent real catastrophe, I think.


> to prevent real catastrophe

I guess we'll agree to disagree on this, I don't think we can prevent real catastrophe. We can prevent some (relatively small) amount of suffering though, and all that matters is that we do the best we can. A mind-opening must-read article (imo) on Collapse: https://medium.com/@CollapseSurvival/overshoot-why-its-alrea...


> is it possible for you to open-source it?

Absolutely, I’d like to work towards contributing useful things to open source gardening technology. Once I have something useful to put into the world I definitely will.

> Does it depend on daylight?

No, this particular system is totally isolated apart from fresh air pulled in to regulate humidity and temperature. The lights are the most expensive aspect of the system by a wide margin, but they do work extremely effectively. The plants are very happy.

I have an older iteration of the system working in an outdoor greenhouse without artificial lighting. It uses fans to regulate air temperature and humidity, but it gets light from the sun. It’s doing fine so far, but the temperature is still relatively lower so growth is slower than in the tent. I’m excited to see what the results are like over summer.

> Is water/air recycled?

This is an interesting question because at the moment the answer is no, but I have the beginnings of plans to recycle the water. I use a reverse osmosis system to feed into the system gradually (this ensures my water sensors provide reliable readings), and I’m fairly sure I could add a secondary tank to drain old solution into, filter it, and use it as the source for the feed into the RO system and then back into the active tank. Though it’s not necessary now, I think that level of efficiency could be incredible.

I’d love to be recycling nutrients as well. I know there’s plenty leftover when a grow is done, but I can’t know what the ratio of each nutrient is in order to rebalance it for the crop I’m growing. I’m sure some growers are able to do this, but I have a feeling it’s a bit beyond me. It seems like a job for a mass spectrometer. That’s possible to DIY in a sane price range, but I will likely need to wait until my kids move out to take that on. I do love the idea in any case — utilizing all of the nutrients and reusing them when possible would be a major accomplishment for me.

> What is the reason behind you making this?

There are several reasons. One, I eat a lot of greens and they’re getting more expensive. I kept a sheet in Soulver (a sort of natural language math program) which outlines a cost breakdown of a head of lettuce grown hydroponically vs from a store. It eventually hit a point where I could grow it for less than I could buy it for, and it justified jumping in and making it happen. My ROI has worked out fine, so the sheet was correct and it’s not crazy to grow with a system like this (so long as you don’t mind the maintenance, harvesting, trial and error, etc). It has actually worked in favour of growing it myself quite a bit more since I first started and hasn’t shown signs of tilting the other way for a while now.

Two, I love learning. The more I learn the more reasons I find to be in awe of the world. Seeing the way the plants grow, understanding the chemistry and biology of the system, accomplishing new things with technology — I find it incredibly fulfilling. It shows my kids that the distance between here and making something interesting or useful happen is simply doing it. First we had an idea, then we had real plants growing almost magically in a system built from scratch. All of that is awesome.

Three, like you I see some instability in the world and I want to have a grasp on how I might ease tasks like finding reliable nutrition. I have bags of fertilizer because they’re not terribly expensive and they can help generate good nutrition quickly, easily, and very reliably. Something like the kratky method can actually work really well even without stable power, so long as light and temperature are reasonable. I also have a lot of seeds for sprouting, as they’re an incredible source of easy nutrition in emergencies too. I don’t really want to need these skills for that, but I do want to have practical skills for producing as much food as possible if something were to happen.

As far as open sourcing goes, I hope to get a sense for how easy or difficult it is to get up and running with this stuff once I can get it in my friends’ hands. I plan to add a crude web interface for managing environment and automation parameters, and I’d like to figure out a way to sensibly scale out the system. For example, not everyone I’ll be giving it to cares about pH or EC, so they don’t need those components. I could simply not solder things onto their boards, but I’d rather figure out something like using standoffs to join the boards in a stack and gradually add features that way. Kind of like hats on an arduino I guess.

As I iron this stuff out I definitely want to put it out in the world. At the moment there are so many superior options in ecosystems like Raspberry Pi, I feel like I’d be wasting people’s time. I do think a Pi is overkill (though potentially complementary) for this kind of thing, and the power of a connected microcontroller with MQTT and simple RPC services is way beyond what most people expect.


Thank you very much! I'll start building a Biosphere 2-like [1] system for climate and weather independent food production later this year inshaAllah. So I'm interested in such projects. I also want to open-source mine because I can't see how I can accomplish it all by myself.

Input: sunlight and (optionally, if not contaminated by nuclear fallout) air. Output: meat, dairy and crops.

> I have an older iteration of the system working in an outdoor greenhouse without artificial lighting.

I think this should be possible with such a system, with full composting (including every kind of biological waste, even human excrement), air-to-water devices, an underground- and solar-based temperature control system, and many more things, all in one massive greenhouse. I don't think hydroponics is the answer, as it is highly fragile (the system exists in a sensitive equilibrium), requires chemicals (meaning a global supply chain), or depends on complicated tech (many failure vectors).

[1] https://en.wikipedia.org/wiki/Biosphere_2


You’re absolutely right — hydroponics is a dead end if you’re mostly or entirely off grid with no supply chain.

I’m very interested in soil-based gardening and cyclical systems too, but I don’t have enough space to really experiment. However, if I did have the space, I’d love to explore an aquaponic system in a large enclosed area. Perhaps a large geodesic dome partly dug into the earth to help stabilize temperature. The ability to generate a nutrient solution from a biological system is absolutely incredible to me.

Thanks for the link to Biosphere 2, this is really interesting.


Thanks for the confirmation on hydroponics. On human manure, Humanure Handbook: https://humanurehandbook.com/


Would love to see a blog post on this or something!


Not OP and not OP's project, but I saw a fantastic automated hydroponic project on YT a few years back that is very similar. YT: [0] Blog Post [1] GitHub for the environmental control system [2]

[0] https://www.youtube.com/watch?v=nyqykZK2Ev4

[1] https://kylegabriel.com/projects/2020/06/automated-hydroponi...

[2] https://github.com/kizniche/Mycodo


I built a system like Kyle Gabriel's (using his tutorial) and I grow mushrooms with it in a small tent, running Mycodo on a Raspi. This has probably been my most interesting tech I built just for myself, and my sanity. But credit where it's due: thanks Kyle!


Kyle is amazing. He’s still very active and supporting people with Mycodo. I learned a lot from him.


One of these days I would really like to.

It’s funny, I used to write all the time and I loved it. I think I became a bit too critical of myself as I saw my site getting more traffic, and I got a bit too anxious to hit publish.

I should get back to it. I’ve been working on a visual editor which generates code you can flash on an arduino with the idea that eventually this could evolve into a little automated gardening platform, but I’m not fully convinced it would be received the way I hope it would. I know a lot of people are into automated gardening, but they might not be the people who would want to use this kind of platform. In any case, I might find out faster if I write about it and see what people think. My friends are certainly into it, but, they’re my friends! Haha. I need some strangers to laugh at my ideas, maybe.


Same!


Same here.


Same!


Same!


Same!


Same!


Same!


Same, please!


Same


Same


Same


I am intrigued and would like to subscribe to your newsletter


Can you please share a list of the sensors you use? I am very interested in this.


I’ve been collecting them over time so I don’t have everything handy, but here are some:

pH: https://www.dfrobot.com/product-2069.html

EC: https://www.dfrobot.com/product-2565.html

Water temperature: https://www.adafruit.com/product/381

CO2: https://www.adafruit.com/product/5190

Air temp and humidity: https://www.adafruit.com/product/3251

There are others but I’ll have to dig into it! I think you could spend less on alternatives, too.


Thank you! I am also super interested in something like this.


You should absolutely go for it. Start even simpler and with lower investment if you want, too. These systems are easy to get running and you can gradually add sensors and automation as you go.

I wish I started a lot earlier, but I was constantly trying to have the right stuff, or enough stuff to get started with the perfect setup. It turns out that makes no sense. You’re going to make mistakes, learn stuff, figure out what you like and don’t like, etc.

Starting with a bare bones setup using NFT, not even in a grow tent necessarily, you’ll figure out really quickly what you want to do with it and how to move forward.

Something I also didn’t really understand or consider is how easy it is to add sensors or update firmware gradually. Each of the sensors I use is useful independently or together; it’s totally fine to start with just one. Though most important is arguably water and air temperature; you’ll use those to accurately adjust other sensor readings, and in the short term, they’re immediately critical to plant health.

I’ve got a small system running on my old deep water culture equipment in my outdoor greenhouse, and I actually check its pH with plain old pH testing drops, a vial, and a card with the colours to match against. It works totally fine. While it won’t teach you about automation, it’ll get you familiar with how your system responds to different conditions, what the pH tends to do with the plants you’re growing, and so on. This is all invaluable and I wish I knew it before I started automating. I would have written better code from the beginning.


How practical is it to do any of this in an apartment, and in a living space? I’m very interested but don’t have any space outside the apartment I live in.


Totally practical in my opinion. I used to have a small tent in a closet, and while it needs ventilation, decent air and water pumps are not that loud at all these days. Some people even build cabinets to hold their systems, so they can go in a laundry room or similar space and be easier to access and vent:

https://youtu.be/EAzsdVAjTWU


> It produces a lot of greens and herbs

If that were NL at this point your whole audience would be on the floor laughing. 'Suuuure...'. What some people won't do to get decent tomatoes.


Haha, I forget sometimes that I’ve totally normalized growing greens and other people associate it with cannabis. I’ve had a couple people come into my workshop and end up looking suspiciously at the grow tent humming along on the corner. When they see that it’s actually just lettuce I think they’re kind of surprised.


Np, I just thought it was very funny. I've had a similar thing here where I ordered 500 ziplock bags and the guy on the other side goes '5 gram or 25 gram?'. So I asked why the bags are so heavy and hilarity ensued. I needed them as parts bags for Lego... but it turns out they almost exclusively sell to gardeners.


Likewise! You should have seen the look on the landlord's face before I opened it up to show them the mess of cucumber plants in space buckets (sidenote: do not grow vining plants in a grow tent!)


You joke, but I have heard radio ads for hydroponic supplies in Canada, which very much had the tone of "wink, wink y'know for your veggie garden".

There was even a chuckling group of people in the background when they mentioned "veggies". This was in Toronto around 2011.


Hey don't judge my pursuit of dank tomatoes


Fingers crossed for full legalization in Minnesota today. Including grow your own!


Tomatoes are already legal, I've no idea what you're talking about.


Tomacco!


Not from NL, but already started smiling at "hydroponic" :).


Sorry for my ignorance, but is NL Netherlands? Also, could you give me some more context on why the people would be laughing?


NL is meant as Netherlands here and the context is that since you can't talk about growing cannabis in the open, people talk about their "vegetable" gardens or "herbs" instead. The comment you're responding to is implying that the grow tent is used to grow cannabis but he's covering it up by saying it's a vegetable garden.


Almost: if it were NL I would imply that it is to grow cannabis but since the OP is obviously 100% sincere I don't doubt they're doing the legit thing.


How do you keep sensors that touch water working long term? I tried similar sensors but they all get too rusted/oxidized to work properly after a certain time.


Not OP, but one of the tricks is to activate the sensors only when measuring, so there's no constant DC applied to the sensor wires/pads. Once you have that, reduce the measurement frequency, so as to minimise the time when voltage is applied to the sensors. For example, once every hour for moisture is sufficient, and 1/sec isn't really going to help much.
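
On an RP2040-class board running MicroPython, that looks roughly like this (a sketch; pin numbers and the settling delay are made up):

    import time
    from machine import Pin, ADC   # MicroPython

    probe_power = Pin(15, Pin.OUT, value=0)  # drives a transistor/MOSFET feeding the probe
    probe_adc = ADC(26)                      # analog input from the probe

    def read_moisture():
        probe_power.on()
        time.sleep_ms(100)      # let the reading settle
        raw = probe_adc.read_u16()
        probe_power.off()       # no standing DC across the probe between readings
        return raw

    while True:
        print("moisture raw:", read_moisture())
        time.sleep(3600)        # hourly is plenty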


Would alternating the polarity work?


Off-topic but perhaps interesting:

That's what they do when performing catheter ablation (a medical procedure for curing cardiac fibrillation by destroying minute parts of muscle with electric current).

DC would work just as fine on this procedure, but due to electrolysis of water, oxygen and hydrogen bubbles would form, which could get stuck somewhere. Using a square wave AC quickly reverses the reaction every period, like you suggested for the moisture meter.

https://en.wikipedia.org/wiki/Catheter_ablation#Technique

I don't know the answer to your question, but it would be worth trying.


These sensors are designed to withstand contact with water and to minimize hydrolysis, and I haven’t had issues with that so far. I’ve been running this system for close to a year and they still seem to calibrate just fine.


You can use capacitive water sensors taped to the outside of non-capacitive containers (aluminum foil, a resistor, an arduino, and a plastic 5 gallon container), but honestly all you need are DNI timers to "automate" any grow operation. Put your lights and pumps on a schedule and there is absolutely no reason to get more creative. If you do anything besides low-level timers you're making it complicated and brittle with no added benefit.


You need industrial level sensors and the water needs to be flowing constantly through them. I built something similar about 15 years ago and tested many sensors. In the end I had to pay about 1000 dollars for ph and ec meters that did the job reliably. To be honest there is nothing new here. This is how big greenhouses have been operating for decades.

In small scale there is more work maintaining the automated setup and calibrating the sensors than it would take to do the measurements and dosing manually.


I think it can cost quite a bit less now, but you’re right — it isn’t cheap.


I've seen a clever setup with the sensors in a dry container above the water tank. There is a hole in the bottom. Before testing, a pump fills the container up with the tank water, flooding the sensor probes. When the pump stops, the water drains back out into the tank.


You’d need to wash the sensors and return their caps with protective fluids. It would be totally possible to automate, but perhaps the same overall cost as buying industrial grade sensors which can handle long term submersion.

You’d also need to ensure the caps contained enough storage solution at the right concentration. Over time the probes would introduce drops of nutrient solution (unless you rinsed them with distilled water, in which case you’d dilute the storage solution), and you’d need to replenish it.


The pH sensor will die fast if the membrane is kept dry.


I'm assuming you have several tanks with pH+ and pH- solutions? Are you using off-the-shelf pH sensors? How about EC?


That’s right, I’ve dissolved sulphur and potassium bicarbonate into separate containers, and peristaltic pumps dose a small amount every 15 minutes when the nutrient solution goes beyond the acceptable parameters for an hour. 15 minutes is enough time for one dose to register on a read of the pH level so that it doesn’t go too far.

As for EC, I can only correct it if it’s too low. If it’s 100 points below where I want it, I dose from two containers of pre mixed nutrient concentrate. They’re in separate containers because they’ll actually precipitate some of their constituents if they’re combined at high concentrations, which is too bad (it would be nice to use only one container).
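
In pseudo-Python, the control loop is basically this (a sketch; the real firmware does more, and the thresholds and pump interfaces here are placeholders):

    import time

    PH_LOW, PH_HIGH = 5.8, 6.2    # acceptable band (placeholder values)
    OUT_OF_RANGE_S = 60 * 60      # must be out of range for an hour before dosing
    DOSE_INTERVAL_S = 15 * 60     # wait 15 min so each dose shows up in the next reading

    out_of_range_since = None
    last_dose = 0.0

    def control_ph(read_ph, pump_ph_up, pump_ph_down):
        """read_ph() -> float; the pump arguments run a peristaltic pump briefly."""
        global out_of_range_since, last_dose
        now = time.time()
        ph = read_ph()
        if PH_LOW <= ph <= PH_HIGH:
            out_of_range_since = None
            return
        if out_of_range_since is None:
            out_of_range_since = now
            return
        if now - out_of_range_since < OUT_OF_RANGE_S:
            return                              # ignore transient swings
        if now - last_dose < DOSE_INTERVAL_S:
            return                              # let the previous dose register first
        (pump_ph_up if ph < PH_LOW else pump_ph_down)(seconds=2)   # one small dose
        last_dose = now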

The pH sensor I use is apparently lab grade, but only cost around $70 CAD. It has been holding up just fine for close to a year now. If I were doing this on a larger scale, I think I’d go for one that’s a bit more expensive from atlas scientific. They seem to stand by their products and claim their pH probes will operate for years if taken care of.

My EC sensor was quite a bit more — something like $150. I forget where I got it, because I had the idea to build this maybe 10 years ago and that was one of the first components I picked up! Looking around it seems like you can spend quite a bit less now, and it seems like they’re durable.


Do you know of a good source of information on how to recognize any plant's nutrient deficiencies accurately?


Check out Bunnie Huang's post from Covid lockdown:

https://www.bunniestudios.com/blog/?p=6481


This is awesome, thank you!


Unfortunately no, there’s a lot of misinformation everywhere I look. I try to record my own experiences and stay on top of tracking results so I can know what helps under which conditions. Hydro seems to have mostly eliminated those concerns for me, though my outside garden still runs into all kinds of problems that are tricky to diagnose.


Do you use fish as well to balance the system or you do it directly using the right chemicals?


I’d love to try using fish some day. I use some buffers I mixed from sulphur and potassium bicarbonate. I get them to an approximate pH and then let the system measure gradually as small amounts are dosed into the system.


Have you checked out flux.ai for PCB design?


Yes actually, that’s what I’m using to learn along with YouTube. I tried other software, but flux kind of hits a sweet spot for me.


Amazing idea. Saved for later.

