
You're probably right when it comes to training, which is likely to remain centralised. Local inference (hopefully on-device) sounds right: less data leaves the device and latency drops. Smaller models have been shown to be just as good as the big ones for everyday use.
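For what it's worth, here's a minimal sketch of what "local inference on-device" can look like, assuming the Hugging Face transformers library and a small instruct model (the model ID is just an illustrative example, not a recommendation):

from transformers import pipeline

# Load a small model and run it entirely on the local CPU -- no prompt data
# is sent to a remote API, so the only latency is the model itself.
generator = pipeline(
    "text-generation",
    model="Qwen/Qwen2.5-0.5B-Instruct",  # illustrative ~0.5B-parameter model
    device=-1,                           # -1 = CPU, i.e. fully on-device
)

result = generator(
    "Summarise in one sentence why on-device inference reduces latency:",
    max_new_tokens=60,
)
print(result[0]["generated_text"])

The point is just that a sub-billion-parameter model runs comfortably on an ordinary laptop, which is what makes the everyday-use case plausible.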
(, Fri 9 Jan 2026, 9:59)