HACKER Q&A
📣 miohtama

Stable Diffusion locally on a mobile phone?


New Android phones come with 6 GB or sometimes even 8 GB of RAM; the Pixel 6, for example, has 8 GB. Furthermore, flagship phones ship with dedicated ML accelerators (such as Google's Tensor chip) alongside powerful GPUs.

Now that Stable Diffusion's memory requirements have been squeezed under 7 GB, would there be any architectural reason it could not run locally on users' phones, assuming the phone is among the most powerful the market currently offers?
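
As a rough sanity check on that figure, here is a back-of-envelope sketch of the weight memory alone, using approximate parameter counts for Stable Diffusion v1.x (treat the numbers as ballpark assumptions, not exact figures):

    # Approximate Stable Diffusion v1.x parameter counts (assumptions).
    params = {
        "unet": 860e6,          # denoising UNet
        "text_encoder": 123e6,  # CLIP text encoder
        "vae": 84e6,            # image autoencoder
    }
    for bytes_per_param, label in [(4, "float32"), (2, "float16")]:
        total_gb = sum(params.values()) * bytes_per_param / 1024**3
        print(f"{label}: ~{total_gb:.1f} GB of weights, before activations")

At float16 that is roughly 2 GB of weights, so much of the 6-7 GB figure is activations, the sampler's working memory, and whatever else the OS keeps resident.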


  👤 imranq Accepted Answer ✓
Stable Diffusion could likely be compressed much further with some tradeoff in quality, and then it could run locally on these phones, much like Pixel phones have built-in speech-to-text models that run without an internet connection.
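
If you wanted to experiment with that, a minimal sketch (not a working Stable Diffusion port) would be post-training quantization through TensorFlow Lite, the same toolchain Pixel's on-device speech models use; "unet_saved_model/" is a hypothetical path to an exported diffusion UNet:

    import tensorflow as tf

    # Convert a (hypothetical) exported UNet SavedModel to TFLite.
    converter = tf.lite.TFLiteConverter.from_saved_model("unet_saved_model/")
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    # float16 quantization roughly halves the size versus float32 weights.
    converter.target_spec.supported_types = [tf.float16]

    tflite_model = converter.convert()
    with open("unet_fp16.tflite", "wb") as f:
        f.write(tflite_model)

The resulting .tflite file would then run on-device through the TFLite interpreter and its GPU/NNAPI delegates, which is where the phone's accelerator hardware actually gets used.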

👤 O__________O
Why would you waste the processing time of an expensive phone to do this? Even if it were possible, it would likely be impossible to run anything else at the same time. If you prefer to work on a mobile device, it makes a lot more sense to just connect to a local server, a remote host, the cloud, or a SaaS.
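
To illustrate the thin-client approach: the phone only sends a prompt and downloads the finished image. The URL and JSON fields below are placeholders for whatever backend you run, not a real API:

    import requests

    # Hypothetical endpoint exposed by a Stable Diffusion server on your LAN.
    resp = requests.post(
        "http://my-sd-server.local:7860/generate",
        json={"prompt": "a watercolor fox", "steps": 30},
        timeout=120,
    )
    resp.raise_for_status()
    with open("result.png", "wb") as f:
        f.write(resp.content)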

👤 josephcsible
I think it'd be theoretically possible, but also horrifically slow. Running it on the CPU of a desktop computer would probably be faster.