
On the flip side, does anyone know if there are any caveats to "recycling" a laptop's motherboard? I've got a laptop with awful thermals but decent hardware. One of the fans is dead, and I've attempted to replace it, but I suspect the fan header itself is busted.* The processor and GPU are just wasted in there, as they throttle hard on basic workloads. So I've been meaning to rip the board out and make an HTPC out of it. I'm not experienced enough with electronics to know what issues I could run into, so I've been holding off. I'd love to hear if anyone else has done something like this.

* Unfortunately, it's possible that my attempt to fix the dead fan is what busted the header. I'm not sure, because the previous fan was dead and the new fan I put in didn't spin either; that's part of why I don't feel comfortable just winging it.



>does anyone know if there's any caveats to "recycling" a laptop's motherboard?

The biggest caveat in reusing old compute hardware, IMO, is power consumption; power efficiency has been one consistent improvement in every new generation of computer chips. So a new Raspberry Pi is almost always as capable as your old hardware for HTPC use, but more power efficient.

The exception might be extremely demanding video, say 8K 10-bit content.

>Unfortunately, it's possible that my attempt to fix a dead fan is what busted the header.

You can try a passive heatsink, or hand fan/thermal management off to a separate unit altogether, like a Corsair Commander (I haven't tried it).

You can also dedicate the hardware to GPGPU applications and HW acceleration on the GPU to reduce CPU load, e.g. as a separate stream-encoding machine.


If you're worried about power consumption from a cost perspective, it's going to take a long time to break even from the cost of new hardware, even for a cheap pi.

If you're worried about it from an environmental perspective, it often takes more energy / resource mining / pollution to manufacture a new piece of hardware than is saved by not running older hardware over a longer lifetime. Running a computer / smartphone / tablet until it doesn't work well enough for its purpose is generally better for the environment than upgrading because shiny, since it slightly reduces the demand for new hardware.


> If you're worried about power consumption from a cost perspective, it's going to take a long time to break even from the cost of new hardware, even for a cheap pi.

That depends a lot on where you live. In Germany, prices approach 0.30€/kWh. A new Pi sets you back 60€ [0]. Assuming you save 20 W and it's on 24/7, you'll break even in just over a year [1].

[0] I know it's marketed as cheaper, but you never actually get one for 35€, and then you still need cables and an SD card.

[1] you save about 52.56€ per year: 20 W × 8760 h = 175.2 kWh, at 0.30€/kWh
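The break-even arithmetic can be sketched in a few lines of Python; the figures (20 W saved, 0.30€/kWh, 60€ total Pi cost) are the assumptions from the comment above, not universal numbers:

```python
# Back-of-the-envelope break-even for replacing always-on hardware
# with a Raspberry Pi. All figures are the assumptions stated above.
PRICE_PER_KWH = 0.30   # EUR (German retail price)
WATTS_SAVED = 20       # W difference, running 24/7
PI_COST = 60.0         # EUR, including PSU, cables, SD card

kwh_per_year = WATTS_SAVED / 1000 * 24 * 365       # 175.2 kWh
savings_per_year = kwh_per_year * PRICE_PER_KWH    # ~52.56 EUR
breakeven_years = PI_COST / savings_per_year       # ~1.14 years

print(f"{savings_per_year:.2f} EUR saved/year, "
      f"break-even after {breakeven_years:.2f} years")
```

Plug in your own tariff and wattage delta; at lower electricity prices or a smaller delta, the break-even point moves out by years.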


Sure, and it never hurts to do the math for your particular situation. Being in California, I'm probably paying about the same per kWh as you. I do think a 20 W delta is rather optimistic: the last time I checked, a decade ago, laptops consumed only 10-15 W at idle, and a Pi likely consumes at least a few watts itself. Laptop hardware is also generally pretty good about reliably suspending to RAM and resuming.

My general point is that there are a lot of externalized costs to consider, depending on what you're trying to optimize for. For example, I live in a somewhat chilly area most of the year, and part of my home is heated electrically rather than by natural gas. I don't particularly care how efficient any electronics are in that part of the house (when I'm using them), because the electricity is converted into useful heat just as well as a space heater would do it.


Good points as well!

> it never hurts to do the math for your particular situation

I think we can agree on that :)


I had a nightmare last night that my 3 year old smartphone's screen got horribly scratched enough to be a constant nuisance but not bad enough to interfere with its function, and I had to decide whether or not to give up and upgrade.


This isn't necessarily related to the discussion at hand, but it reminded me:

I got a huge crack (https://ibb.co/1mPxC6C) in my smartphone's screen right around New Year's. One of my goals was to use my phone less (less social media). The crack is a big, ~.25in black diagonal line across the screen, but the touch still works. Because it's a diagonal crack, I can still read whole text messages/emails/etc, I just have to scroll the right way.

It's now March and I still haven't fixed my screen, mostly out of laziness since I'm home all the time, but also because it's done wonders to my smartphone use. I practically don't idly scroll anymore, just because it's really annoying to do so, and I don't miss it at all. Everyone thinks I'm nuts, but I haven't read twitter in like 3 months so who's the crazy one?

I'd say, don't bother upgrading! Enjoy that the addictive quality of your smartphone is diminished.


Lol, thankfully it was just a dream and my screen is still flawless! I did stop bothering to put protective glass on it after that cracked a few months ago.

The only thing I read obsessively on my phone these days is hacker news. :-) I stopped using most social media 6 or 7 years ago.


> The biggest caveat in re-using old compute hardware IMO is power consumption, power-efficiency has been one consistent improvement in every new generation of computer chips. So, a new Raspberry Pi is almost always as capable as your old hardware for HTPC but more power efficient.

Yes, but your typical laptop rarely exceeds 30 W. If your electricity is priced at $0.12 per kWh and you run that laptop 24x7:

30 W × 24 h × 31 days = 22.32 kWh per month

which is about $2.68 of electricity.
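The same arithmetic as a Python sketch, using the figures assumed above (30 W sustained draw, $0.12/kWh, a 31-day month):

```python
# Monthly electricity cost of a ~30 W laptop running 24x7,
# at $0.12 per kWh (assumptions from the comment above).
WATTS = 30
PRICE_PER_KWH = 0.12  # USD

kwh_per_month = WATTS / 1000 * 24 * 31   # 22.32 kWh
cost = kwh_per_month * PRICE_PER_KWH     # ~$2.68

print(f"{kwh_per_month:.2f} kWh/month -> ${cost:.2f}")
```

Even doubling the wattage or the tariff only gets you to around $10/month, which is why the payback period on new hardware is so long.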


Or 3.6% of a Bitcoin transaction per month haha


Being snarky is fun, but in this case misleading.

Transactions don’t consume this electricity, competing for the next block does.


The block of transactions is literally just a division problem, lol. I could have written "1% of the energy required to process a block of 3 transactions."


The quantity of transactions here is irrelevant, so is the power cost per transaction.


lol, ok, then the power cost per block of 3 transactions. Multiply by 3.


You multiply by zero as the transactions do not contribute to energy usage.


haha, without transactions you wouldn't need to mine anything, would you? The network would already be secure and the data valid. If there were no transactions, the ledger wouldn't need to be appended to, and you could just throw it into read-only storage and call it a day.

Transactions are the only thing that contributes to energy usage.


I think you are confused. The only thing that contributes to energy usage is mining blocks, aka the block reward subsidy. Without transactions, it would still happen that way. And does, in the form of empty blocks.


> I think you are confused. The only thing that contributes to energy usage is mining blocks, aka the block reward subsidy.

Which is a reward for processing transactions.

> Without transactions, it would still happen that way. And does, in the form of empty blocks.

If we eliminated the ability to transact Bitcoin, the ledger would be closed - finalized. A data blob plus a published SHA. Like any other archive.

Mining allows you to amend said ledger, by processing a batch of updates. It is the process by which you amend the ledger, grouped into blocks of 0 or more transactions.

Transactibility is what uses 100% of the energy of the bitcoin network and it's fine to quantize that energy usage on a per-block, and per-transaction basis.

Mining furnishes transactibility. Transactibility is quantized into blocks, blocks into transactions. I'm sorry, it's clear, transactions and the ability to execute them are what uses 100% of the power of Bitcoin. If not that, then what? Don't just say "mining" - explain what mining provides if not transactibility.


> Which is a reward for processing transactions.

No, it is a reward for finding the next block. Miners can optionally choose to include transactions if they wish.


A block of... 0 or more what now? Which wouldn’t need to exist if the ledger was immutable? Ergo... I mean you seem pretty intelligent so I’m gonna assume you get my point.

0 or more transactions form a block, which is mined to facilitate mutations. Mutations in the form of transactions. Nothing more. The block reward is compensation for providing the service of processing blocks of 0 or more transactions. A mining scheme that consistently has 0 transactions achieved nothing. End of story. If you disagree explain to me what else mining achieves.


No, you nailed it, 0 or more transactions. Power is not directly tied to transactions as a result.

Edit: in case you’re still struggling: mining yields the block subsidy reward. That is its primary function.


See it’s solely for transactions. Without transactions or mutability mining wouldn’t be required. This division is fair game. The block reward subsidy is only required to reward the processing of blocks which wouldn’t need to happen if transactions weren’t a feature.

The method of facilitating transactions consumes all the power, so you can divide the average number of transactions per block to obtain the power expended per average transaction.

Trust me I’m not struggling here, except with pushing back on the koolaid consumption. Folks who pursue extensive mental gymnastics to justify their having obtained wealth through environmental destruction: it’s like if everyone holding Bitcoin woke up one day to realize they’d killed a gorilla in the process. They’d try really hard to convince themselves they didn’t or that it wasn’t their fault. Even to pretend it didn’t happen because the gorillas were simply transmuted to a better life in the sky.


I disagree with your points, you can’t really generalize it all together and say “see look it’s the same!” when it is so clearly not. Good day.


First of all, thanks!

> The biggest caveat in re-using old compute hardware IMO is power consumption

This is a relatively recent laptop (it has a GTX 1060 inside!), and I can't get hardware that capable for cheap enough for it to be worth it. My only reason for making it an HTPC is that I don't want to see it gather dust on a shelf for no reason.

> You can try passive heatsink

I've considered that + other cheap ways, but I'm concerned about mounting. The CPU runs hot, even when well cooled, so I'd need some way to get good contact with the die without harming it. There's a few screw slots around it, but those are for the heat pipes it comes with. That's the biggest challenge in this project from my perspective.

> You can also use your hardware for only GPGPU applications & HW accel on GPU to reduce load on CPU e.g. Separate stream encoding machine

Unfortunately, I don't have any use cases where that could come in handy. Working a full-time job killed my dreams of producing content :'). I just want this to sit at my 4K TV and play content without snooping on me, like all the big name boxes do.


I have a 5-year-old laptop whose display died after 2 years. Rather disappointing that high-end stuff doesn't last and manufacturers don't seem to care, but that's another rant. I plugged it into my TV to use for emulators. It's fun to play occasionally, but overall I hate it because of how whiny and high-maintenance Windows 10 is. Every single freaking time I want to use it, I have to pull it out from under the blu-ray player and reassure it of its insecurities. Every few days it will randomly wake up and start loudly revving its fans up and down too, just to remind me that it exists to torment me.


if you're just running emulators... linux?


When I looked into it 5 years ago, there were supposedly issues with a non-standard NVMe interface firmware or something. There's a good chance it would just work these days, but in my experience, getting Linux to run reliably on a laptop that wasn't designed for it can be tricky. It was originally a work laptop, so Linux compatibility wasn't a consideration.

Maybe I'll whip up a Linux live usb drive and try booting it this weekend!


Mineral oil submersion for cooling may work


Or, keeping it in a moisture-conditioned freezer. Or, ... That's it, that's the only impractical idea I had. Oh wait, water cooling!


This is the key for me, and a major reason I've always avoided the cheap, older hardware off of eBay for home use. My server rack consists of hardware that was newly released at the time of purchase, chosen with power efficiency as a major focus.

One measurement that really surprised me was that my 2019 XPS 13 idles at about 1.5 W with the screen on, according to powertop. That's less power than a Raspberry Pi takes to idle!


I did this for my younger sister's first computer. We didn't have a lot of expendable cash, so I took my old laptop, stripped the motherboard out, mounted it on a makeshift open-air rig, and hooked it up to a second-hand monitor (the laptop screen was irreparably broken). Served her well for three years or so.


How did you get good contact with the CPU die? I've considered an open air setup even before I had a backup PC to be bold with the laptop, but I wasn't sure it could cool the CPU.


It won't be pretty but if you can access the CPU from the top or bottom, you can strap a desktop heatsink down with zipties running both ways around the laptop.


I did this with a 2015 MacBook Pro, because I needed it to go full tilt and didn’t want it to throttle. https://twitter.com/nehbit/status/1243329494481432581?s=21

In my case, the weight of the fan connected to a 12v motorcycle battery was enough to keep it from throttling at indefinite 100% load; I didn't have to zip-tie anything. Mind that I did not remove the existing fans: I just used thermal interface material on top of the existing heat sink, added some salvaged aluminium passive cooling sinks on top, and then, above that, had the fan screwed into an Amazon shipping box so the fan blades didn't hit the sinks.


This is a pretty popular video, so many may already be aware of this type of project, but just in case: https://www.youtube.com/watch?v=e3fnsGHe8eE


Let me add three additional DIY Perks video links, as they are also relevant to the original topic (hopefully I'm not going overboard with these):

- Build a DIY screen out of recycled parts for cheap: https://youtu.be/CfirQC99xPc

- Building a USB-C touchscreen monitor: https://youtu.be/DrqdHVeBkp4

- Transform a Damaged Laptop into an all-in-one desktop PC: https://youtu.be/8jeLCQ62vFk

The Lenovo with the bent bottom case appeared in many episodes as illustration, until it finally got repurposed!


You could just get a 5V fan with a USB connector and plug it into a USB port.


You could sit the whole thing in front of a normal desk fan until you can get a usb one, if everything else seems functional.


I've mentioned this in a sibling comment, but my concern with that is that it wouldn't be enough for a 45W CPU that runs fairly hot.


Leaked schematics are available for a lot of them (even big names like Apple, as Louis Rossmann can tell you...), so component-level repair can be possible.



