Found the spreadsheet https://goo.gl/z8nt3A
And the source: https://www.hardwareluxx.de/community/threads/die-sparsamsten-systeme-30w-idle.1007101/
Still, you can calculate how much you would actually save from a 2 W power reduction before selling this one and buying a different NAS.
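A rough sketch of that calculation (the electricity price is an assumed placeholder, not your actual rate):

```python
# Rough payback estimate for saving a couple of watts at idle.
# The electricity price is an assumed placeholder.
watts_saved = 2
price_per_kwh = 0.30  # EUR, assumed
hours_per_year = 24 * 365

kwh_saved_per_year = watts_saved * hours_per_year / 1000
savings_per_year = kwh_saved_per_year * price_per_kwh
print(f"{kwh_saved_per_year:.1f} kWh/year ~= {savings_per_year:.2f} EUR/year")
# ~17.5 kWh/year, i.e. roughly 5 EUR/year, so any loss from selling and
# rebuying hardware takes a long time to earn back from 2 W alone.
```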
You can reduce the disk spin-down timeout to 5-15 minutes after the last access for better power saving.
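If the drives support it, that timeout can usually be set with `hdparm -S <value> /dev/sdX` (the device name is a placeholder). The value encoding is odd, so here is a small helper for the 5-15 minute range; per the hdparm man page, values 1-240 are multiples of 5 seconds:

```python
def minutes_to_hdparm_s(minutes: int) -> int:
    """Convert a spin-down timeout in minutes to an `hdparm -S` value.

    hdparm encodes values 1-240 as multiples of 5 seconds (5 s to 20 min);
    longer timeouts use a different encoding not handled here.
    """
    if not 1 <= minutes <= 20:
        raise ValueError("this helper only covers 1-20 minutes")
    return minutes * 60 // 5

print(minutes_to_hdparm_s(10))  # 120, i.e. `hdparm -S 120 /dev/sdX`
```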
Maybe you are looking at the wrong thing. The idle states of the CPU and motherboard controllers matter more than spun-down HDDs.
I saw a spreadsheet somewhere listing a lot of CPU + motherboard combinations with their idle power consumption, meant for ultra-low-energy NAS optimisation.
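To put rough numbers on why the platform matters more (every figure below is a ballpark assumption for illustration, not a measurement):

```python
# Ballpark idle-power comparison; every number here is an assumption.
hdd_spinning_idle_w = 4.0   # assumed 3.5" drive idling while spun up
hdd_standby_w = 0.8         # assumed draw of the same drive spun down
platform_gap_w = 10.0       # assumed gap between a CPU/board combo that
                            # reaches deep C-states and one that does not

per_drive_saving = hdd_spinning_idle_w - hdd_standby_w
print(f"spinning down one drive saves ~{per_drive_saving:.1f} W")
print(f"an inefficient CPU + board combo wastes ~{platform_gap_w:.0f} W around the clock")
# With only a couple of drives, the CPU + motherboard choice dominates,
# which is what that spreadsheet is meant to help with.
```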
If this is a textbook that you need to have in class, I would say go to a print shop and order a couple of copies for you and your classmates (they also want cheaper textbooks). I think the biggest problem will be getting a usable binding, as loose or stapled paper won't cut it. A print shop will have the machines and expertise to do it relatively cheaply.
I once saw a pirated textbook in class and it was done like that. I think half the class had a pirated copy.
A modem translates the fiber or DSL signal onto a twisted-pair (Ethernet) cable.
An access point translates twisted-pair Ethernet into Wi-Fi.
I think you are looking for an all-in-one router.
For AI/ML workloads, VRAM is king.
As you are starting out, something older with lots of VRAM will serve you better than something faster with less VRAM at the same price.
The 4060 Ti is a good baseline to compare against, as it has a 16GB variant.
The "minimum" VRAM for ML is around 10GB, and the more the better. Less VRAM can be usable, but with sacrifices in speed and quality.
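A rough rule of thumb for why that is (the overhead factor is an assumption; real usage also depends on context length and framework):

```python
def rough_vram_gb(params_billion: float, bits_per_param: int, overhead: float = 1.2) -> float:
    """Very rough VRAM estimate: weights plus an assumed fudge factor.

    `overhead` is an assumed allowance for activations/KV cache, not a
    measured value; actual needs vary with context length and framework.
    """
    weight_gb = params_billion * bits_per_param / 8  # billions of params * bytes each
    return weight_gb * overhead

# e.g. a 7B model: ~16.8 GB at fp16, ~4.2 GB with 4-bit quantization,
# which is why ~10-16 GB cards are the comfortable starting point.
print(rough_vram_gb(7, 16), rough_vram_gb(7, 4))
```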
If you are still into this stuff in a couple of months, you could sell the GPU you buy now and swap it for a 4090 Super.
AMD support is confusing: there is no official ROCm support for mid-range GPUs on Linux, but some people say it works anyway.
There is a new version of ZLUDA that enables running CUDA workloads on ROCm:
https://www.xda-developers.com/nvidia-cuda-amd-zluda/
I don't have enough info to recommend AMD cards.
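One quick sanity check if you do try one, assuming you install the ROCm build of PyTorch (the ROCm build reuses the `torch.cuda` interface):

```python
# Check whether PyTorch sees a GPU; works the same on CUDA and ROCm builds.
import torch

print("GPU available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
    # Set on ROCm builds, None (or absent) on CUDA builds.
    print("HIP version:", getattr(torch.version, "hip", None))
```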
Languages create natural barriers on the internet, which leads to language-specific sites that deal only in one language.
FMHY has quite a few entries in its non-English section.
Removed by mod
https://lemmy.ml/post/10284661
No clear reason stated.
I think a short event or campaign pushing for donations, with a pop-up that you can actually dismiss, like an ad banner. The biggest problem would be community organization, as Lemmy isn't only decentralized horizontally but also vertically: different front ends, different apps, different instances. Most of them wouldn't want to implement an ad that doesn't benefit them directly, and they also have costs from running their piece of Lemmy, so some cut for them should be included.
I think a dedicated, trustworthy person should be responsible for organizing this campaign, as developer time is best spent elsewhere.
I think compute per watt and idle power consumption matter more than raw maximum compute power.
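A made-up comparison to illustrate, for a box that sits idle most of the day (every number below is a placeholder assumption):

```python
# Two hypothetical machines finishing the same daily workload.
# All figures are placeholder assumptions for illustration.
fast_box = {"idle_w": 40, "load_w": 150, "load_hours": 2}       # faster, poor idle
efficient_box = {"idle_w": 10, "load_w": 120, "load_hours": 3}  # slower, idles low

for name, box in {"fast": fast_box, "efficient": efficient_box}.items():
    idle_hours = 24 - box["load_hours"]
    kwh_per_day = (box["idle_w"] * idle_hours + box["load_w"] * box["load_hours"]) / 1000
    print(f"{name}: {kwh_per_day:.2f} kWh/day")
# fast: ~1.18 kWh/day, efficient: ~0.57 kWh/day - the idle term dominates
# when the machine spends most of the day doing nothing.
```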