
PocketLawyer Edge AI

On-device legal assistant for Android with fully local LLM inference — no server, no data leaks.

On-device LLM · Android · Zero API calls

The Problem

Legal guidance in India is complex, costly, and often inaccessible. Cloud-based AI assistants make it worse on privacy: every sensitive legal query is transmitted to third-party servers.

Approach

Built a fully on-device Android app with a quantized LLM running locally. Zero network calls for inference. Privacy-preserving by architecture.
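One way "privacy-preserving by architecture" can be made enforceable at the OS level is to never declare the network permission at all. This is a sketch of that idea, not necessarily the app's actual manifest; names like `ChatActivity` are illustrative:

```xml
<!-- Sketch: no <uses-permission android:name="android.permission.INTERNET"/>
     is declared anywhere. Without that permission, Android denies the app
     process any network sockets, so "zero API calls" is enforced by the
     platform rather than by app code. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <application android:label="PocketLawyer">
        <activity android:name=".ChatActivity" android:exported="true"/>
    </application>
</manifest>
```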

Architecture

- Quantized LLM bundled with the APK
- On-device inference via Android NDK bindings
- Query preprocessing and context injection
- Response streaming to the chat UI
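The preprocessing and streaming steps above can be sketched in plain Kotlin. The system prompt wording, function names, and the stubbed token source are all assumptions for illustration; in the real app the tokens would arrive from the native (NDK) inference layer rather than a stub:

```kotlin
// Context injection: wrap the raw user query in a system prompt
// before handing it to the model.
fun buildPrompt(query: String, context: String): String = buildString {
    appendLine("You are a legal assistant for Indian law.")
    if (context.isNotBlank()) {
        appendLine("Relevant context:")
        appendLine(context)
    }
    appendLine("Question: ${query.trim()}")
    append("Answer:")
}

// Streaming: the inference layer invokes the callback once per decoded
// token, and the chat UI appends tokens as they arrive. A stub stands
// in for the native call here.
fun streamTokens(prompt: String, onToken: (String) -> Unit) {
    val stubReply = listOf("This ", "is ", "a ", "stubbed ", "reply.")
    for (token in stubReply) onToken(token)
}

fun main() {
    val prompt = buildPrompt("Is a verbal contract enforceable?", "")
    val reply = StringBuilder()
    streamTokens(prompt) { reply.append(it) }
    println(reply)
}
```

The callback shape matters for UX: rendering token-by-token hides the latency of local inference on mid-range hardware, since the user sees output immediately instead of waiting for the full completion.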

Results

100% on-device inference. No data transmitted to any server. Viable on mid-range Android hardware.

App interface