
Commit c81813a

committed
new On-device AI / Edge AI article based on the Droidcon presentation: On-device AI goes mainstream on Android
1 parent 566984a commit c81813a

File tree

2 files changed: +195 -0 lines changed
Lines changed: 194 additions & 0 deletions
@@ -0,0 +1,194 @@
---
title: "On‑Device AI Goes Mainstream on Android"
description: "On-device AI / Edge AI / Mobile AI / Local AI - whatever the name, it is already very much possible today and has many benefits. Here's how you can get started (now or whenever you're ready)."
slug: edge-ai-anywhere-anytime
image:
---

import Head from '@docusaurus/Head';

# On‑Device AI Goes Mainstream on Android

This article is a written recap of my [Droidcon Berlin 2025 talk](https://www.youtube.com/watch?v=jwOToFCQ41Y), so the hands-on, practical part focuses on Android and Mobile AI. You can [find the slides here](#) (slideshare/pdf link). In this talk, we explored why the shift towards Edge AI matters, especially for developers, how developers can get started, and what to watch out for.

:::note
Edge AI may also be called **On-device AI**, **Mobile AI**, or **Local AI**.
:::

Artificial Intelligence (AI) is shifting from the cloud to the **edge** — onto our phones, cars, and billions of connected devices. This move, often described as **Edge AI** ([What is Edge AI?](https://objectbox.io/on-device-vector-databases-and-edge-ai/)), unlocks AI experiences that are private, fast, and sustainable.

---

## Why Edge AI Now?

Two megatrends are converging:

- **[Edge Computing](https://objectbox.io/dev-how-to/edge-computing-state-2025)** - Processing data where it is created (on the device, locally, at the edge of the network) is called "Edge Computing", and it keeps growing.
- **AI** - AI capabilities and adoption are expanding rapidly and need no further explanation.

<img src="/static/img/edge-ai/edge-ai.jpg" alt="Edge AI: Where Edge Computing and AI intersect" />

Where these two trends overlap, at the intersection, we speak of Edge AI (also called Local AI or On-device AI, or, for the mobile subset, "Mobile AI").

The shift to Edge AI is driven by use cases that:
* need to work offline
* have to comply with specific privacy / data requirements
* would need to transfer more data than the available bandwidth allows
* need to meet real-time or specific QoS / response-time requirements
* are not economically viable when using the cloud / a cloud AI
* want to be sustainable

<img src="/static/img/edge-ai/edge-ai-benefits.jpg" alt="Edge AI drivers (benefits)" />

If you're interested in the sustainability aspect, see also: [Why Edge Computing matters for a sustainable future](https://objectbox.io/why-do-we-need-edge-computing-for-a-sustainable-future/)

## Why it's not Edge AI vs. Cloud AI - the reality is hybrid AI

Of course, while we see a market shift towards Edge Computing, there is no Edge Computing vs. Cloud Computing - the two complement each other, and the question is mainly: How much edge does your use case need?

<img src="/static/img/edge-ai/cloud-to-edge-continuum.jpg" alt="The cloud-to-edge continuum" />

Every shift in computing is empowered by core technologies.
<img src="/static/img/edge-ai/computing-shifts-empowered-by-core-tech.jpg" alt="Every shift in computing is empowered by core technologies" />

## What are the core technologies empowering Edge AI?

If every megashift in computing is powered by core tech, what are the core technologies empowering the shift to Edge AI?

Typically, Mobile AI apps need **three core components** (a sketch of how they fit together follows below):
1. An **on-device AI model (e.g. [SLM](https://objectbox.io/the-rise-of-small-language-models/))**
2. A [**vector database**](https://objectbox.io/vector-database/)
3. **Data sync** for hybrid architectures ([Data Sync Alternatives](https://objectbox.io/data-sync-alternatives-offline-vs-online-solutions/))

<img src="/static/img/edge-ai/core-tech-enabling-edge-ai.jpg" alt="The core technologies empowering Edge AI" />
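
To make the architecture concrete, here is a minimal Kotlin sketch of how the three pieces typically interact in a RAG-style flow. All interfaces and names below (`Embedder`, `VectorStore`, `LocalLlm`) are hypothetical placeholders for illustration, not a specific library API:

```kotlin
// Hypothetical interfaces standing in for the three core components.
interface Embedder { fun embed(text: String): FloatArray }                             // on-device embedding model
interface VectorStore { fun nearestNeighbors(v: FloatArray, k: Int): List<String> }    // on-device vector database
interface LocalLlm { fun generate(prompt: String): String }                            // on-device SLM

// A minimal RAG-style flow: embed the question, retrieve local context,
// and let the small on-device model answer using that context.
fun answerLocally(question: String, embedder: Embedder, store: VectorStore, llm: LocalLlm): String {
    val queryVector = embedder.embed(question)
    val context = store.nearestNeighbors(queryVector, k = 5).joinToString("\n")
    val prompt = "Answer using only this context:\n$context\n\nQuestion: $question"
    return llm.generate(prompt)
    // Data sync (the third component) would keep the local store and a backend
    // consistent when the device is online - not shown here.
}
```
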
## A look at AI models

### The trend to "bigger is better" has been broken - the rise of SLMs and small AI models

Large foundation models (LLMs) remain costly and centralized. In contrast, [**Small Language Models (SLMs)**](https://objectbox.io/the-rise-of-small-language-models/) bring similar capabilities in a lightweight, resource-efficient way.

<img src="/img/edge-ai/slm-quality-cost.png" alt="SLM quality and cost comparison" />
- Up to **100x cheaper** to run
- Faster, with lower energy consumption
- Near-large-model quality in some cases

This makes them ideal for **local AI** scenarios: assistants, semantic search, or multimodal apps running directly on-device. However...

### Frontier AI Models are still getting bigger and costs are skyrocketing
<img src="/img/edge-ai/slm-quality-cost.png" alt="SLM quality and cost comparison" />

### Why this matters for developers: Monetary and hidden costs of using Cloud AI

Running cloud AI comes at a cost:

- **Monetary costs**: The cloud cost conundrum ([Andreessen Horowitz 2021](https://a16z.com/the-cost-of-cloud-a-trillion-dollar-paradox/)) is fueled by cloud AI; margins shrink as data center and AI bills grow ([Gartner 2025](https://x.com/Gartner_inc/status/1831330671924572333))
- **Dependency**: A few tech giants hold all major AI models, the data, and the know-how, and they make the rules (e.g. thin AI layers on top of huge cloud AI models will fade away due to vertical integration)
- **Data privacy & compliance**: Sending data around adds risk, and so does sharing it (what exactly are you agreeing to?)
- **Sustainability**: Large models consume way more energy, and transmitting data unnecessarily consumes way more energy too (think of it as buying apples from New Zealand while living in Germany) ([Sustainable Future with Edge Computing](https://objectbox.io/why-do-we-need-edge-computing-for-a-sustainable-future/)).
<img src="/img/edge-ai/slm-quality-cost.png" alt="SLM quality and cost comparison" />

### What about Open Source AI Models?

Yes, they are an option, but be mindful of potential risks and caveats. Keep in mind that with commercial models, part of what you pay for is freedom from liability risks.
<img src="/img/edge-ai/slm-quality-cost.png" alt="SLM quality and cost comparison" />

### While SLMs are all the rage, Edge AI is really about specialised AI models (at this moment...)
<img src="/img/edge-ai/slm-quality-cost.png" alt="SLM quality and cost comparison" />


## On-device Vector Databases are the second essential piece of the Edge AI Tech Stack

- Vector databases are basically [the databases for AI applications](https://objectbox.io/empowering-edge-ai-the-critical-role-of-databases/). AI models work with vectors (vector embeddings), and vector databases make working with vector embeddings easy and efficient.
- Vector databases offer powerful vector search and querying capabilities, provide additional context and filtering mechanisms, and give AI applications a long-term memory.
- Most AI applications, e.g. Retrieval Augmented Generation (RAG) or agentic AI, need a vector database; vector databases are also used to make AI apps more efficient, e.g. by reducing LLM calls and providing faster responses.

:::info
On-device (or Edge) vector databases have a small footprint (a couple of MB, not hundreds of MB) and are optimized for efficiency on resource-restricted devices.
:::

(Note: Edge vector databases, or on-device vector databases, are still rare. ObjectBox was the first on-device vector database available on the market. Some server- and cloud-oriented vector databases have recently begun positioning themselves for edge use. However, their relatively large footprint often makes them more suitable for laptops than for truly resource-constrained embedded devices. More importantly, solutions designed by scaling down from larger systems are generally not optimized for restricted environments, resulting in higher computational demands and increased battery consumption.)

<img src="/img/edge-ai/vector-database.png" alt="Vector Databases" />

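To give a feel for what this looks like in practice, here is a minimal Kotlin sketch of an on-device vector index and nearest-neighbor query, assuming ObjectBox 4.x's vector search (HNSW) API. The entity, its property names, and the 384-dimension embedding size are illustrative assumptions, not code taken from the example app:

```kotlin
import io.objectbox.Box
import io.objectbox.annotation.Entity
import io.objectbox.annotation.HnswIndex
import io.objectbox.annotation.Id

// Illustrative entity: a screenshot with its OCR text and an embedding vector.
@Entity
data class Screenshot(
    @Id var id: Long = 0,
    var ocrText: String? = null,
    // Assumption: a 384-dimensional embedding (depends on the embedding model used).
    @HnswIndex(dimensions = 384)
    var embedding: FloatArray? = null
)

// Nearest-neighbor search: find the 10 screenshots most similar to a query embedding.
// Screenshot_ is generated by the ObjectBox Gradle plugin.
fun findSimilar(box: Box<Screenshot>, queryEmbedding: FloatArray): List<Screenshot> {
    val query = box.query(
        Screenshot_.embedding.nearestNeighbors(queryEmbedding, 10)
    ).build()
    return query.findWithScores().map { it.get() } // similarity scores are available via it.score
}
```
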
## Developer Story: On-device AI Screenshot Searcher Example App

To test the waters, I built a [**Screenshot Searcher** app with ObjectBox Vector Database](https://github.com/objectbox/on-device-ai-screenshot-searcher-example):

- OCR text extraction with ML Kit (a minimal sketch follows below)
- Semantic search with MediaPipe and ObjectBox
- Image similarity search with TensorFlow Lite and ObjectBox
- Image categorization with ML Kit Image Labeling

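As a taste of the OCR step, here is a minimal Kotlin sketch using ML Kit's on-device Text Recognition API. The function and variable names are illustrative; the example app's actual code may differ:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Extract text from a screenshot bitmap fully on-device with ML Kit.
fun extractText(screenshot: Bitmap, onResult: (String) -> Unit) {
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    val image = InputImage.fromBitmap(screenshot, /* rotationDegrees = */ 0)
    recognizer.process(image)
        .addOnSuccessListener { result -> onResult(result.text) } // full recognized text
        .addOnFailureListener { e -> onResult("") /* log e in a real app */ }
}
```

The recognized text can then be stored alongside its embedding in the vector database sketched above.
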
This was easy and took less than a day. However, I learned more from the things I tried that weren't easy... ;)

### What I learned about text classification (and what will hopefully help you)
<img src="/img/edge-ai/on-device-text-classification.png" alt="On-device Text Classification Learnings" />

See the finetuning section below: without finetuning there was no usable model, and without a model, no text classification.

### What I learned about finetuning (and what will hopefully help you)
<img src="/img/edge-ai/finetuning-text-model-learnings.png" alt="Finetuning Learnings (exemplary, based on finetuning DBPedia)" />

Finetuning failed, so I will try again ;)

### What I learned about integrating an SLM (Google's Gemma)

Integrating Gemma was super straightforward; it worked on-device in less than an hour (just don't use the Android emulator (AVD): running Gemma on the AVD is not recommended, and it also did not work for me).
<img src="/img/edge-ai/using-gemma-on-android.png" alt="Using Gemma on Android" />

In this example app, we are using Gemma to enhance the screenshot search with an additional AI layer (see the sketch after this list):
- Generate intelligent summaries from OCR text
- Create semantic categories and keywords
- Enhance search queries with synonyms and related terms

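For orientation, here is a minimal Kotlin sketch of running Gemma on-device via MediaPipe's LLM Inference API, which is one common way to do this on Android. The model file path and token limit are assumptions for illustration; check the MediaPipe documentation and your model's requirements:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Summarize OCR text with an on-device Gemma model via MediaPipe LLM Inference.
fun summarizeOnDevice(context: Context, ocrText: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        // Assumption: the Gemma model file was pushed to the device beforehand.
        .setModelPath("/data/local/tmp/llm/gemma-2b-it-cpu-int4.bin")
        .setMaxTokens(512)
        .build()
    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse(
        "Summarize the following screenshot text in one sentence:\n$ocrText"
    )
}
```

In practice you would create the `LlmInference` instance once and reuse it, since loading the model is the expensive part.
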
## Overall assessment of the practical, hands-on state of On-device AI on Android

It's already fairly easy, and vibe coding an Edge AI app is very doable. While I would of course recommend the latter only for prototyping and testing, it is amazing what you can already do on-device with AI, even without being a developer!

<img src="/img/edge-ai/final-tech-stack.png" alt="Final Tech Stack" />

---

## Key Questions to Ask Yourself

- How much **edge vs. cloud** do you need?
- Which tasks benefit from **local inference**?
- What data **must remain private**?
- How can you make your app **cost-efficient** long term?

---

## How to Get Started

- Learn about [Local AI](https://objectbox.io/local-ai-what-it-is-and-why-we-need-it/)
- Explore [Vector Databases](https://objectbox.io/vector-database/)
- Prototype with the [On-device AI Screenshot Searcher Example](https://github.com/objectbox/on-device-ai-screenshot-searcher-example)
- Consider [Data Sync](https://objectbox.io/data-sync-alternatives-offline-vs-online-solutions/) for hybrid apps
- Read more on [Empowering Edge AI with Databases](https://objectbox.io/empowering-edge-ai-the-critical-role-of-databases/)

---

## Conclusion

We’re at an inflection point: AI is moving from centralized, cloud-based services to decentralized, personal **on-device AI**. With **SLMs**, **vector databases**, and **data sync**, developers can now build AI apps that are:

- Private
- Offline-first
- Cost-efficient
- Sustainable

The future of AI is not just big — it’s also **small, local, and synced**.

<img src="/img/edge-ai/ai-anytime-anywhere.png" alt="AI Anytime Anywhere Future" />

---

sidebars.ts

Lines changed: 1 addition & 0 deletions
@@ -35,6 +35,7 @@ const sidebars: SidebarsConfig = {
       items: [
         'edge-computing-edge-ai-local-ai-marketanalysis',
         'on-device-vector-database-sync',
+        'on-device-ai-goes-mainstream',
       ],
     },
   ],
