As AI models make their way into smartphones, the discourse around data security is rapidly gaining steam. Premium devices with dedicated Neural Processing Units (NPUs) can handle a portion of AI tasks on the device itself but rely on cloud processing for more demanding workloads. Apple is going down the same route by outsourcing some of its AI processing to OpenAI's servers. If that is a deal breaker for you, there are workarounds you might be interested in. Today, we bring you the easiest way to run LLMs locally on your Android phone.
Is the MLC Chat app worth a go?
The MLC Chat app makes it possible to run a range of LLMs locally on your Android phone. Of course, this means your Android phone needs a processor that is up to the task; devices running the Snapdragon 8 Gen 2 or Gen 3 are ideal. Even then, you may still run into lag or hiccups, since language models push the limited resources of an Android flagship hard.
Luckily, MLC Chat has six LLMs to choose from, including popular models like Llama 3, which powers Meta AI. You can also use a small language model like Microsoft's Phi 2 for a much smoother experience. Keep in mind that MLC Chat only supports text conversations; more demanding tasks like image generation are still not possible using on-device resources alone.
Install and run local LLMs on your Android phone using MLC Chat
Step 1: Install the MLC Chat app on your Android phone using this link.
Step 2: Since you’re downloading the app from a third-party source, you will need to give your web browser permission to download and install MLC Chat from an unknown source.
Step 3: Launch the MLC Chat app on your phone.
Step 4: Go through the list of language models on offer. At the time of writing, there are six LLMs available on MLC Chat: Gemma 2B, Llama 2 7B, Llama 3 8B Instruct, Phi 2, Mistral 7B Instruct, and RedPajama Incite Chat 3B.
Step 5: Hit the downward arrow icon next to a language model to download it to your device.
Step 6: Tap the chat icon to initiate a conversation with the AI model.
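If you would rather skip the browser's unknown-source prompt from Steps 1 and 2, the APK can also be sideloaded over USB with adb. The sketch below is a hypothetical alternative, not part of the official instructions: the filename mlc-chat.apk is an assumption (use whatever name your downloaded APK has), and it assumes Android platform-tools are installed and USB debugging is enabled on the phone.

```shell
#!/bin/sh
# Hypothetical alternative to Steps 1-2: sideload the MLC Chat APK over USB
# with adb instead of installing through the browser. Assumes the APK has
# already been saved as mlc-chat.apk (hypothetical filename) and that USB
# debugging is enabled on the phone.
APK="mlc-chat.apk"
if command -v adb >/dev/null 2>&1; then
  # -r reinstalls the app if it is already present, keeping its data
  MSG="$(adb install -r "$APK" 2>&1)"
else
  MSG="adb not found: install Android platform-tools first"
fi
echo "$MSG"
```

Either way, once the app is installed the remaining steps are identical.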
Frequently Asked Questions (FAQs)
How do I run an on-device AI model on my Android phone?
Download the MLC Chat app on your phone, select an LLM, and hit the chat icon to initiate a conversation with the AI model. All the AI processing will take place on your device and won’t be outsourced to a cloud server.
How powerful does my phone need to be to run an on-device AI model?
The MLC Chat app does not require a dedicated NPU to run an LLM on your phone. However, it is recommended that you use a smartphone with a powerful chipset like the Snapdragon 8 Gen 2 (or above).
Does Meta AI on WhatsApp carry out its processing on my device?
No, the Meta AI chatbot on WhatsApp offloads its processing to a cloud server. The data you feed it is not confined to your device.
Conclusion
This is how you can run LLMs locally on your Android phone. Because LLMs are resource-intensive, expect occasional slowdowns or crashes when running heavier models like Llama 2 on your device. If your phone lacks the processing power, we recommend a lightweight model like Phi 2.