b3ta.com links

This is a link post Are big data centres really needed for AI?
Nah
(, Fri 9 Jan 2026, 6:16, Reply)
This is a normal post I was going to call BS
As the early text implied it hasn't been possible to run models locally until now. The differentiating feature seems to be that their solution supports distributed systems and so is scalable.
They mention the huge amount of power consumed by data centres - I'd like to know whether lots of end-user clusters are actually more energy efficient.
(, Fri 9 Jan 2026, 7:01, Reply)
This is a normal post I can't see any possible scenario where lots of small local clusters are more energy efficient than a large data centre.
These servers are massively power-hungry, but also massively efficient. They need to be. A 1% increase in efficiency across an entire datacentre is a fucking enormous increase in compute/decrease in energy consumption. The stakes are far higher than making a local node 1% more efficient.
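A rough back-of-envelope sketch of that point in Python (the 100 MW facility and 1 kW home rig below are illustrative assumptions, not figures from the article):

# Back-of-envelope comparison: what a 1% efficiency gain is worth.
# All figures are illustrative assumptions, not real measurements.

DATACENTRE_POWER_W = 100e6   # assume a 100 MW hyperscale facility
HOME_CLUSTER_POWER_W = 1e3   # assume a ~1 kW home/office node
HOURS_PER_YEAR = 24 * 365

def annual_saving_kwh(power_w: float, efficiency_gain: float = 0.01) -> float:
    """Energy saved per year (kWh) from a given fractional efficiency gain."""
    return power_w * efficiency_gain * HOURS_PER_YEAR / 1000

print(f"1% at the datacentre: {annual_saving_kwh(DATACENTRE_POWER_W):,.0f} kWh/year")   # ~8,760,000 kWh
print(f"1% at a home node:    {annual_saving_kwh(HOME_CLUSTER_POWER_W):,.0f} kWh/year")  # ~88 kWh

Under those assumed numbers, the same 1% gain is worth roughly a hundred thousand times more energy at the datacentre than at a single home node, which is why the optimisation effort goes where it does.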
(, Fri 9 Jan 2026, 7:11, Reply)
This is a normal post That's as I suspected
So promoting their product based on the fact it avoids power-hungry data centres is rather misleading.
I can see that there are use cases where it would be justified, but why muddy the waters with BS?
(, Fri 9 Jan 2026, 16:52, Reply)
This is a normal post Money.

(, Fri 9 Jan 2026, 19:09, Reply)
This is a normal post If
They get you to do their processing, you pay the electricity bill.

Once you're the one who needs the electricity, they don't, and they don't have to pay for it. Do you have AI on your home computer? Is that them outsourcing? Is the biggest parallel processor the internet?
(, Thu 15 Jan 2026, 0:15, Reply)
This is a normal post Economy of scale is going to be the issue here, I suspect.
Yeah, you can run training/inference across a distributed LAN, but then you have to deal with things such as maintenance, increased energy costs, security, redundancy, etc. Then there's the challenge of preventing obsolescence.
Christ, it was only a couple of years ago that these AI models were struggling to draw hands, and now they're producing videos and images that are all but indistinguishable from reality. A local AI cluster isn't ever going to keep up with that kind of progress.

It's a bit like suggesting 'Why store all your data in a hosted cloud storage solution, when you could build your own, shitter, local cloud?'
(, Fri 9 Jan 2026, 7:07, Reply)
This is a normal post
You're probably right when it comes to training, which is likely to remain centralised. Local inference (hopefully on-device) sounds right, with less data transfer and lower latency. Smaller models have been shown to be just as good for everyday use.
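For illustration, a minimal sketch of that kind of on-device inference, assuming the Hugging Face transformers library is installed and the small model named below (an example choice, not anything from the link) has been downloaded locally:

# Minimal local inference sketch: run a small model entirely on-device.
# Assumes `pip install transformers torch`; the model name is an
# illustrative example of a compact instruct model, not a recommendation.
from transformers import pipeline

generator = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")

result = generator(
    "Explain in one sentence why local inference can reduce latency.",
    max_new_tokens=60,
)
print(result[0]["generated_text"])

Nothing leaves the machine: no data transfer, no round trip to a datacentre, just whatever the local CPU/GPU can manage.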
(, Fri 9 Jan 2026, 9:59, Reply)
This is a normal post "We have an AI datacentre at home."

(, Fri 9 Jan 2026, 7:22, Reply)
This is a normal post "Just for the children's homework, when they're home from school."

(, Fri 9 Jan 2026, 15:13, Reply)
This is a normal post Probably just for all the log files that no one wants to delete or truncate

(, Fri 9 Jan 2026, 8:41, Reply)
This is a normal post Is AI really needed?

(, Fri 9 Jan 2026, 15:13, Reply)
This is a normal post For big data centres?
- yeah
(, Fri 9 Jan 2026, 22:25, Reply)
This is a normal post I'm an AI bot
and I fucking love the big datacentres, reminds me of skynet

i'm not an AI bot btw
(, Sun 11 Jan 2026, 22:47, Reply)
This is a normal post I AM ROBOT

(, Mon 12 Jan 2026, 18:50, Reply)