Large language models can be squeezed onto your phone, rather than needing thousands of servers to run, after compression breakthrough

Running massive AI models locally on smartphones or laptops may be possible after a new compression algorithm shrank them down to size, meaning your data never leaves your device. The catch: it might drain your battery in an hour.