What are you working on? Any new ideas that you're thinking about?
A tree cutting tool.
Take photos of the tree from 6 different angles, feed into a 3D model generator, erode the model and generate a 3D graph representation of the tree.
The tool suggests which cuts to make and where, given a restricted fall path (e.g. constrained by a neighbor's yard on one side).
I create the fallen branches in their final state along the fall plane, and create individual correction vectors mapping them back to their original state, but in an order that does not intersect other branch vectors.
The idea came to me when a particularly difficult tree needed to come down in my friend's yard, and we spent hours planning it out. I've already gotten some interest from the tree-surgeon community; I just need to appify it.
Second rendition will treat the problem more as a physics one than a graph one, with some energy-minimisation methods for solving.
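A minimal 2D sketch of the first rendition's intersection check and greedy ordering, assuming each branch reduces to a single correction vector (function names are mine, not OP's; the real problem is 3D and also handles rotation):

```python
def segments_intersect(p1, p2, p3, p4):
    """Return True if 2D segments p1-p2 and p3-p4 properly intersect."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def order_cuts(vectors):
    """Greedily pick a cut order so that each correction vector, when
    'played back', does not cross any still-pending vector.
    vectors: dict name -> (start, end) 2D points. Returns list of names."""
    pending = dict(vectors)
    order = []
    while pending:
        for name, (a, b) in pending.items():
            others = [v for n, v in pending.items() if n != name]
            if not any(segments_intersect(a, b, c, d) for c, d in others):
                order.append(name)
                del pending[name]
                break
        else:
            raise ValueError("no non-intersecting order exists")
    return order
```

This greedy pass is quadratic in the number of branches, which is fine for a single tree; it raises when two vectors mutually cross, which is one reason a physics formulation may be easier.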
This is the kind of thing that makes me love HN. An idea I would never have thought of, with an immediately obvious use in multiple ways (fall path plus ideal lumber cutting?), probably very difficult, yet being tackled with one implementation already... and spoken of quite humbly.
Having joined my father and his friend in cutting down big trees in the village neighborhood, I can personally vouch that this is indeed a cumbersome and very complex task, in both the planning and the execution phases. However, for us the task is made easier by trimming the tree's branches first, since my father's friend is an expert tree climber.
From your description, it seems that your tree cutting procedure does not involve precutting the tree's branches before felling the tree.
I've got the feeling that this tree cutting problem can be solved with constraint programming techniques [1],[2]. Alternatively, generic constraint programming tools such as OR-Tools and MiniZinc can probably do the same, if not better [3],[4].
[1] Logic, Optimization, and Constraint Programming: A Fruitful Collaboration - John Hooker - CMU (2023) [video]:
https://www.youtube.com/live/TknN8fCQvRk
[2] "We Really Don't Know How to Compute!" - Gerald Sussman - MIT (2011) [video]:
https://youtube.com/watch?v=HB5TrK7A4pI
[3] Google OR-Tools:
https://developers.google.com/optimization
[4] MiniZinc:
https://www.minizinc.org/
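A toy, pure-Python version of the kind of constraint model those tools solve, brute-forcing a feasible cut order (the branches and constraints here are made up; a real solver like CP-SAT searches far more cleverly):

```python
from itertools import permutations

def first_feasible_order(branches, parent, before):
    """Find a cut order satisfying two toy constraints: every branch is
    cut before its parent (you can't drop a limb still carrying children),
    plus explicit (a, b) pairs meaning a must be cut before b.
    branches: list of names; parent: dict child -> parent name or None."""
    for perm in permutations(branches):
        pos = {b: i for i, b in enumerate(perm)}
        ok = all(pos[c] < pos[p] for c, p in parent.items() if p is not None)
        if ok and all(pos[a] < pos[b] for a, b in before):
            return list(perm)
    return None  # over-constrained: no feasible order exists
```

Brute force dies quickly past a dozen branches; the point is only that "cut order with precedence constraints" maps naturally onto a CP model.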
Testing this is a real pain in the ass: you have to cut down a real tree to see if it works in various situations :(
Funny, one of mine also involves trees -- but is mostly outdoor cleanup. The kind that involves decades' worth of it, thanks to what I'll just say is a lot of maintenance that wasn't done over a long time. There's an extensive amount of brush, leaves, etc of varying ages that could maybe be shredded up into something useful, invasive vines I'm still trying to deal with, and more old trash than I've fully figured out how to properly dispose of.
It's turning into various DIY rabbit holes, actually, with the next one (outside of various related landscaping stuff) being to gut a basement.
I would love to have such a model tell me how to prune my fruit trees as they grow up. Should be a fairly straightforward supervised problem with the right front end for the graph generation.
You can start right now with an algorithm I learned from an expert when I was working in a landscaping business.
It is a very simple three-pass plan: "Deadwood, Crossovers, Aesthetics".
So, first pass, go through the tree cutting out only and all the dead branches. Cut back to live stock, and as always make good clean angle cuts at a proper angle (many horticulture books will provide far better instructions on this).
Second pass, look only for branches that cross over other branches, especially those that show rubbing or friction marks against other branches. Cut the ones that are either least healthy or grow in the craziest direction (i.e., deviating most from the normal, more-or-less radial growth away from the trunk).
Then, and only after the other two passes are complete, start pruning for the desired look and/or size & shape for planned growth or bearing fruit.
This method is simple and saves a LOT of ruined trees from trying to first cut to size and appearance, then by the time the deadwood and crossovers are taken later, it is a scraggly mess that takes years to grow back. And it even works well for novices, as long as they pay attention.
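A sketch of the first two passes as a filter over a branch list (field names are hypothetical, and real judgment calls like "least healthy" are left to the pruner):

```python
def plan_pruning(branches):
    """Group pruning cuts by pass: deadwood first, then crossovers.
    branches: list of dicts with hypothetical fields 'id', 'dead' (bool),
    'crosses' (ids of branches it crosses), 'rubbing' (bool).
    The aesthetics pass is intentionally left empty: it is a judgment call
    made only after the first two passes are complete."""
    deadwood = [b["id"] for b in branches if b["dead"]]
    alive = [b for b in branches if not b["dead"]]
    # Of each crossing pair, flag the one showing rub marks, as a crude
    # stand-in for "least healthy or craziest direction".
    crossovers = [b["id"] for b in alive if b["crosses"] and b["rubbing"]]
    return {"deadwood": deadwood, "crossovers": crossovers, "aesthetics": []}
```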
I'd suspect entering the state and direction of every branch into an app would take longer than just pruning with the above method, although for trees that haven't fully leafed out, perhaps a 360° set of drone pics could make an adequate 3D model to use for planning?
In any case, good luck with your fruit trees — may they grow healthy and provide you with great bounty for many years!
When I read the OP, this is what I thought it was going to be: these branches are going to be apex competitors, these are crossing or going to cross, this one shows signs of disease, this one interrupts air flow through the centre, etc.
I was imagining something like this for pruning fruit trees — something to help noobs like me see how to put pruning guidelines into practice on a real, overgrown tree. Good luck!
That’s a great idea, but so much liability if the user is an amateur and follows steps incorrectly
Making this determination alone will sink you in legal fees
Does an insane amount of fine print really save you? Even if you say the model is only an aide to be used by licensed or certified professional arborists or whatever, I fear some Joe blow whose tree lands on his house will be suing you.
I've been thinking for years about a safer alternative to chain saws. Something along the lines of a carbide-coated wire driven by an electric motor and battery: strap it to the tree, turn it on, walk away, and some minutes later the tree falls down.
The main difficulty is in how to drive the wire. Using friction would create fast-wearing parts. Maybe a chain could be used instead of a wire; it could oscillate back and forth, instead of having to be wrapped and spliced to form a circle around the tree.
It seems really strange that no one has come up with an alternative to chain saws for decades (except for large-scale trucks that can process whole trees). For small trees and branches even a sawz-all is safer than a chain saw. Inspired by spending some time sharing a hospital room with a guy who had a chain saw accident, but I still haven't come up with a workable idea. Maybe someone else can.
I was thinking I could use a tool just like the wire you described to remove a stump, after I spent 6 hours with a 5ton Bobcat trying to dig up a 3ft diameter pine stump to no avail today. For felling trees though, you need precise front cuts/back cuts to drop the tree at a desired angle, you can't just cut in one direction even if you have a cable attached.
Would burning the stump work?
- https://www.youtube.com/watch?v=XTeGbunc_Sk
Perhaps drilling in a wedge shape so it weakens the branch and it eventually breaks off naturally but it seems like more work than just a chain saw. The holes could also be used for steam treatment, enzymes [0] or something else to break it.
I use a tool called the Alligator. It is a tool you can use like scissors. It has two chains on the inside of the business ends. You put it around the branch, close the ends, and press the button. Springs will then close the ends even more and cut the wood. No open chains.
I hear you on the safety angle, but chainsaws work really well. They're also very versatile.
We use a winch to guide the branches down, but would never apply the winch directly to the tree in case of whiplash when the branch finally breaks
Forgive my ignorance but all the tree cutting I've observed has been based on climbing and cutting in segments from the top rather than letting the tree fall. Under what circumstances is it better or necessary to actually let the tree fall?
IANAL (L=lumberjack) but it's clearly going to be cheaper if you can just chop it down, right? Quicker and less equipment required, less danger to life from having to climb and wield a chainsaw in an elevated position. Also, if you are interested in getting long planks out of the trunk, you would not want to cut it down progressively.
Any tree near civilization that someone wants to pay to cut down is likely to be close enough to a structure, power lines, or pipes in the ground (that's also the reason they want it removed) that felling it in one piece is not an option, which is why you see them sectioned down or taken out with a crane or bucket truck. A lot of the trees handled by tree care companies in residential or commercial areas are also sprawling hardwoods like oaks that can't always be safely dropped whole even when there's a clear area to drop them in. For logging or wildland firefighting, most of the trees being dropped are straight-growing softwoods like pine or fir that are felled from the ground (or by machine).
I think it depends on how much space there is for the tree to safely fall. If there isn’t enough space to accommodate the height of the tree, it needs to be done in controlled segments.
I was thinking of something similar during pruning season for my apple trees a few months ago. I even went so far as to take a scan of one of my trees with Luma and had it generate a 3D render of it. This worked surprisingly well, though it did take several days to get it rendered as it seemed their service was saturated.
My need/idea was to post that somewhere (r/backyardorchard probably) to get help in determining which limbs to prune. However, there didn't seem to be an easy way to share that sort of thing, and time was of the essence, so I just forged ahead on my own.
Where I live this could be very helpful, because people are, how to say it, maybe ignorant of safety and basic logic. It could also be useful to know or estimate which trees are at imminent or higher risk of falling in the wind.
Happy to help!
Cool idea. Just wondering why you wouldn't use Lidar for this? I'd have thought the spatial fidelity of a Lidar model would provide a much better model of the weight distribution of a tree.
Do consider the value of the wood in relation to your cuts. A well-placed cut not only guarantees safety but will also take the maximum board feet from the tree.
I work a lot with NVEL for this; one time I even tried porting NVEL to WASM for fun and client accessibility. We "virtually buck" trees, which seems like it could be applied to your proposed use case. If OP wants to go down this path: https://github.com/FMSC-Measurements/VolumeLibrary/tree/77d4...
Seems insignificant. What are you optimizing for- an extra foot or two?
Yes, board feet is usually measured by the inch.
with dimension lumber it's way more about the width you can cut than length; sometimes shorter is more valuable depending on supply & demand (and transport). Accounting for the fact that trees are not perfect cylinders (or cones, really) is where all the fun optimization comes from anyways.
good ol conical frustum
Then just make the cut as low to the ground as possible. You don’t need a lot of complex math for that.
The right cuts at the right heights, while working down the tree from a specific max height, to still produce viable board feet while maximizing boards per cut. In most places, unless you're pulping the entire tree, it's quite a bit more complicated than cutting as low as possible.
It's surprising to me how little work is done to make the tools which do this accessible, considering how much money and open data there is.
Where it gets less open and more complicated is when you consider that certain mills can only make certain cuts, produce certain products, and accept certain logs. Then factor in the distance between mills and the products they can make, and also the log lengths accepted by the trucks that can travel those routes.
It's all solvable and should be solved, but it's so niche that I still think there isn't an accessible solution.
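At its simplest, the length side of this is the classic rod-cutting dynamic program: given which log lengths a mill accepts and a price for each (prices here are made up), choose bucking points that maximise value. A sketch:

```python
def best_bucking(stem_len, price_by_len):
    """Maximise total value of bucking a stem of integer length stem_len
    into logs whose lengths appear in price_by_len (dict length -> price).
    Returns (best_value, list_of_log_lengths). Ignores taper, defects,
    and per-width value, which is where the real complexity lives."""
    best = [0.0] * (stem_len + 1)
    choice = [0] * (stem_len + 1)
    for n in range(1, stem_len + 1):
        for length, price in price_by_len.items():
            if length <= n and best[n - length] + price > best[n]:
                best[n] = best[n - length] + price
                choice[n] = length
    logs, n = [], stem_len
    while n > 0 and choice[n]:
        logs.append(choice[n])
        n -= choice[n]
    return best[stem_len], logs
```

Folding in mill capabilities, products, and truck routes turns this into a network optimization problem, but the core value recursion stays the same.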
No offense, but this comment is very reductionist. The problem isn’t nearly as simple as you’re making it out to be.
Yes, and the wider the board, the more it costs per bf as well.
I have a couple products I make that require 12" widths, which means I pay a whole lot more per bf than < 10" widths at my hardwood supplier.
Yes, but most trees are plenty tall enough.
You should branch out (hehe) into flower and plant pruning suggestions with your app. ChatGPT can do this now if prompted.
This is a great idea. I have a huge tree in the front yard that will either cost me $5-10k to take down, or I was going to rent a lift and do it myself. A few particular branches scare me though, in terms of how they will come down... Bonus points for suggesting where to tie things off.
Trees cost a lot of money to bring down. I've had ideas for an automated cutter, but it's a surprisingly difficult problem.
I work in forestry software and am curious about your methods. Is any of this open source? Any intention of supporting growth modeling?
I plan for a time-bomb license (closed source for 10 years, make my money (if any), GPLv3 after that).
My methods are all over the place. The tree is taken as-is on the day, and cuts are calculated on the fly; there's no future growth modelling, if that is what you're asking.
I was mixing methods, sorry. My initial rendition for solving the cuts would initialise a somewhat sparse network from tree to ground, and solve for non-overlapping paths.
This became convoluted and I just opted for a far easier method of solving vector intersections.
It's also not perfect, since I haven't factored in the rotation origin very well, and I'm now pursuing a far simpler physics-based approach.
Perhaps an opportunity for weed control for lawn as well.
I’m working on Popgot (https://popgot.com), a tool that tracks unit prices (cost per ounce, sheet, pound) across Costco, Walmart, Target, and Amazon. It normalizes confusing listings (“family size”, “mega pack”, etc.) to surface the actual cheapest option for daily essentials.
On top of that, it uses a lightweight AI model to read product descriptions and filter based on things like ingredients (e.g., flagging peanut butter with BPA by checking every photograph of the plastic or avoiding palm oil by reading the nutrition facts) or brand lists (e.g., only showing WSAVA-compliant dog foods). Still reviewing results manually to catch bad extractions.
Started this to replace a spreadsheet I was keeping for bulk purchases. Slowly adding more automation like alerting on price drops or restocking when under a threshold.
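The normalization step can be sketched like this (the size-parsing rule and the listing fields are simplified stand-ins; real listings need many more formats than "N x M oz"):

```python
import re

def price_per_unit(listing):
    """listing: dict with 'price' (dollars) and 'size' text such as
    '2 x 16 oz' or '40 oz'. Returns dollars per ounce, or None when the
    size can't be parsed."""
    m = re.search(r"(?:(\d+)\s*x\s*)?([\d.]+)\s*oz", listing["size"], re.I)
    if not m:
        return None
    count = int(m.group(1) or 1)       # pack multiplier, default 1
    ounces = count * float(m.group(2))
    return listing["price"] / ounces

def cheapest_first(listings):
    """Sort parseable listings by normalized unit price, cheapest first."""
    priced = [(price_per_unit(l), l) for l in listings]
    return [l for ppu, l in sorted(
        ((p, l) for p, l in priced if p is not None), key=lambda t: t[0])]
```

This is why "family size" listings can rank above smaller packs even when their shelf price looks higher.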
I don't think I have the time to go to different stores to buy different things based on what is cheap. I have one fixed one.
However, what I would like is a product where I upload my shopping receipt for a few weeks/months from the one store I go to. The application figures out what I typically buy and then compares the 4-5 big stores and tells me which one I should go to for least price.
Yeah, I agree. It is a pain to search product by product instead of sticking to one store. Also popgot.com can only do what's online & shipped to you -- so really just the non-perishables / daily essentials that are not fresh groceries. But even when limited to consumables I save ~$100/mo by basically buying by unit price.
Uploading a receipt to see how much you can save... that's a good idea. I think I can find your email via your personal site. Can I email you when we have a prototype ready?
A one time email is fine.
However, I am in Canada. So can only test it once you expand there. Thanks.
I don't know how things are in the US, but it does seem like the grocery store oligopoly is squeezing consumers a lot, so tools like this are valuable for injecting competition into the system.
Shameless plug for my own project (https://grocerytracker.ca/) since you're in Canada. Eventually I'd love for it to do what you're suggesting, but for now the closest thing you can do is create a basket for each store with the same items and then check each week to see which is the cheapest.
This is a great idea. And OCR should be good enough nowadays to parse the receipts. Probably would work best as a mobile app, though.
Have you looked at receipts? They’re narrow, only do one line per item, and every store prints something different for the same product. It usually includes a store specific sku, the price and some truncated text on that single line. Good luck figuring out exactly what someone purchased from a random receipt.
Awesome site. You've probably come across it, but just in case you haven't: in the UK we have trolley.co.uk (plus app), which is handy. I use the barcode scanner a lot when I want to check whether a branded product is a good price in the shop I'm standing in or whether I'm getting ripped off. They have all products (I assume because online grocery shopping is bigger here?). Personally, I'm looking to start shopping online (new dad, so time poor); it'd be great if I could build a shopping list and have a site tell me which online grocer to order from for the best value, with a basket price breakdown for each.
This is so good I disabled my ad blocker.
Thank you. Seriously.
Note: I searched "protein bars", and it treated all protein bars equally. The 1st-20th cheapest had <15g of protein per bar. I had to scroll down to the 50th-60th to find protein bars with 20g of protein, which surprised me by being cheaper than Kirkland Signature's protein bars.
My pleasure! Happy you could use it as much as I do. Anyway we can chat in person? I'd love to make more stuff for you. chris@<our site>.com
I like this idea a lot -- feels like there's a lot of room to grow here. Do you have any sort of historical price tracking/alerting?
And/or also curious if there is a way to enter in a list of items I want and for it to calculate which store - in aggregate - is the cheapest.
For instance, people often tell me Costco is much cheaper than alternatives, and for me to compare I have to compile my shopping cart in multiple stores to compare.
> For instance, people often tell me Costco is much cheaper than alternatives, and for me to compare I have to compile my shopping cart in multiple stores to compare.
A few years ago, I was very diligently tracking _all_ my family's grocery purchases. I kept every receipt, entered it into a spreadsheet, added categories (eg, dairy, meat), and calculated a normalized cost per unit (eg, $/gallon for milk, $/dozen eggs).
I learned a lot from that, and I think I saved our family a decent amount of money, but man it was a lot of work.
Glad you guys mentioned Costco -- I happen to have written a blog post on exactly that: https://popgot.com/blog/retailer-comparison Surprisingly, Costco does not win most of the time, and especially if you are not brand loyal. Costco has famously low-margins, but it turns out that when you sort by price-per-unit they're ok, but not great.
@mynameisash I'm curious what you learned... maybe I can help more people learn that using Popgot data.
One thing to call out is that costco.com and in-person have different offerings (& prices) -- but you probably know that already.
I just dusted off my spreadsheet, and it's not as complete as I'd like it to be. I didn't normalize everything but did have many of the staples like milk and eggs normalized; some products had multiple units (eg, "bananas - each" vs "bananas - pound"); and a lot of my comparisons were done based on the store (eg, I was often comparing "Potatoes - 20#" at Costco but "Potatoes - 5#" at Target over time).
Anyway, Costco didn't always win, but in my experience, they frequently did -- $5 peanut butter @ Costco vs $7.74 @ Target based on whatever size and brand I got, which is interesting because Costco doesn't have "generic" PB, whereas Target has much cheaper Market Pantry, and I tried to opt for that.
My family’s favorite experience has been that Costco usually doesn’t have the cheapest option but it has a good value option.
Our main example is something like pasta. Our local grocery stores all carry their own brand of dirt cheap pasta but it’s not as good as the more expensive pasta at Costco. Comparable pasta at the local grocer would be more expensive.
For items that are carried at both stores, Costco is usually no cheaper than the regular retail price and rarely much more expensive.
The quality difference I find between Costco and Walmart is significant, even if the price is not that different.
I'm so glad you like it!
We have historical price tracking in the database, but haven't exposed it as a product yet. What do you have in mind / what would you use it for?
There is a project linked to the Open Food Facts nonprofit collecting prices of any products (food or other) with barcodes: https://prices.openfoodfacts.org/about. They have a system for automatic price detection from labels and are working on one for receipts.
I like that you have the ability to exclude on some dimension (eg, I don't use Amazon.com). Do you have or are you considering adding more retailers beyond the four you mentioned? For example, I buy a lot of unroasted coffee from sweetmarias.com, and excluding Amazon from Popgot results eliminates all but one listing (from Walmart).
Ah, hell yeah! My buddy on this project has been itching to add sweetmarias.com ... he just needed this as an excuse.
So yeah, we'll add it. If you shoot me an email (or post it here?) to chris @ <our site>.com I'll send you a link when it's done. Should take a day or two.
Cool project!
I run tech for a reverse logistics business buying overstock from Costco/Target/Walmart and we’re building a similar system for recognizing and pricing incoming inventory. I sent an email a few days ago to see if you might be open to chatting.
It would be great to compare notes or explore ways to collaborate. Totally understand if things are busy!
God tier filtering. Do you mind sharing how you integrated AI into the filter system? Your "flagging peanut butter" example also makes me wonder if the LLM is tagging the product with a large number of attributes on each run so it's not prohibitively expensive.
Cool! I hope it's coming to Japan (where I live) in the near future.
The first ever SQL debugger – runs & visualizes your query step-by-step, every clause, condition, expression, incl. GROUP BY, aggregates / windows, DISTINCT (ON), subqueries (even correlated ones!), CTEs, you name it.
You can search for full or partial rows and see the whole query lineage – which intermediate rows from which CTEs/subqueries contributed to the result you're searching for.
Entirely offline & no usage of AI. Free in-browser version (using PGLite WASM), paid desktop version.
No website yet, here's a 5 minute showcase (skip to middle): https://www.loom.com/share/c03b57fa61fc4c509b1e2134e53b70dd
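For a flavor of what "every intermediate result" means, here's a crude approximation with stdlib sqlite3 that materializes each CTE on its own (the actual debugger steps through every clause, not just whole CTEs; the schema and query here are made up):

```python
import sqlite3

def peek_ctes(conn, ctes, final_query):
    """ctes: list of (name, sql) pairs in dependency order. For each CTE,
    re-run the WITH-prefix up to that CTE and select its rows, so every
    intermediate result can be inspected like a debugger snapshot."""
    snapshots = {}
    for i, (name, _) in enumerate(ctes):
        prefix = ", ".join(f"{n} AS ({q})" for n, q in ctes[: i + 1])
        snapshots[name] = conn.execute(
            f"WITH {prefix} SELECT * FROM {name}").fetchall()
    full = ", ".join(f"{n} AS ({q})" for n, q in ctes)
    snapshots["final"] = conn.execute(f"WITH {full} {final_query}").fetchall()
    return snapshots

# Tiny made-up demo: two chained CTEs over one table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t(x INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(1,), (2,), (3,)])
snaps = peek_ctes(
    conn,
    [("big", "SELECT x FROM t WHERE x > 1"),
     ("doubled", "SELECT x * 2 AS y FROM big")],
    "SELECT SUM(y) FROM doubled")
```

Re-running prefixes like this repeats work for every snapshot, which is presumably one thing a purpose-built debugger avoids.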
Was thinking today... not a debugger, but even a SQL progress bar would help, so I know in advance that my ADD COLUMN will take, say, 7 hours.
This is awesome! I work with a team of analysts and data engineers who own a pretty big Snowflake data warehouse. We write a ton of dbt models and have a range of SQL skill levels on the team. This would be the perfect way to let more junior devs build their skills quickly and support more complex models.
I would recommend you target data warehouses like Snowflake and BigQuery, where the query complexity, and thus the value prop for a tool like this, is potentially much higher.
Thank you, nice to get some idea validation from folks in the industry. For sure data warehouses are the top priority on my TODO list, I picked PG first because that's what I'm familiar with.
I can ping you via email when the debugger is ready, if you're interested. My email is in my profile
This would be incredible to understand why some queries execute slow; most of the time it's one of the steps in between that takes 99% of the execution time at our company. Do you record the time each step takes?
Can you not use EXPLAIN ANALYZE to identify steps that had the highest compute time? I think most databases have some form of this.
This is a great command everyone should know. We once had a long-running database query that was blocking a pipeline (the code was written in a week and of course became integral to operations). Ran it, 15 minutes of thinking, added a new index on a now-important column, and cut the run time down from almost 30 minutes to 5 seconds.
I've never heard of this, and I'm pretty sure my coworkers haven't either. Thanks for mentioning it!
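If you want to try it without a server, stdlib sqlite3 shows the same effect (SQLite's EXPLAIN QUERY PLAN gives only the plan shape, while PostgreSQL's EXPLAIN ANALYZE additionally reports actual timings; the table and index names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders(id INTEGER PRIMARY KEY, customer_id INTEGER)")

def plan(sql):
    """Return SQLite's query-plan description as one string."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)  # last column holds the detail text

query = "SELECT * FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)   # with the index: an index search
```

The before/after plans read roughly as "SCAN orders" versus "SEARCH orders USING INDEX idx_orders_customer", which is the kind of step the commenter's 15 minutes of thinking identified.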
https://chatgpt.com/share/68104c37-b578-8003-8c4e-b0a4688206...
You're onto the original idea I started out with! Unfortunately it's very difficult to correlate input SQL to an output query plan – but possible. It's definitely in future plans
MSSQL has the execution plan thing that will tell you which steps are involved and how long they take.
Possibly look at https://duckdb.org/community_extensions/extensions/parser_to...
Even if not for DuckDB, you can use this to validate/ parse queries possibly.
Thanks for the suggestion! I am using https://github.com/tobymao/sqlglot, which magically supports most SQL dialects. And yes, support for DuckDB is also in future plans
Thanks! Would you mind sharing what would be your use cases?
At my job, all of our business logic (4 KLOC of network topology algorithms) is written in a niche query language, which we have been migrating to PostgreSQL. When an inconsistency/error is found, tracking it down can take days of manually commenting out parts of the query and looking at the results.
I'm not the person you asked, but I feel it could have good value for education and learning as well, besides debugging.
Finish it, shut up, take my money! This looks really good - make a website just to make it possible to sign up for updates.
Thanks for the motivation to finish this as soon as possible :) I'm working on a basic landing page with screenshots/videos and a "get notified" button right now – shoot me an (empty, if you want) email (in profile) and I'll ping you as soon as it's ready
https://dequery.io :) Added a little signup form (with possibility of providing additional feedback)
That was quick. I wanted to be quick too, but learned that I was too eager and missed that there are optional fields to pick from. I then re-submitted with the same email (will it go through, or do you have uniqueness validation?), providing the optional stuff this time. Maybe play with the positioning of the optional fields so they're more apparent before the email address is submitted.
Thanks for the feedback! I updated the form
Cool! We're dealing with many complex CTEs and costly queries. Would be useful to have those visualized one by one.
What database are you using? I'd be happy to hear about your usecases and hopefully help you, shoot me an email (in profile)
this is very cool! Where can I follow you to see updates?
For now, yes, but I'll start working on adding support for all other DBs (especially OLAP) as soon as possible. The general approach is the same; I just have to handle all the edge cases of the SQL dialects.
This comment is still (surprisingly!) gaining traction, so leaving this here
Nice one!
Following...