Nvidia Jetson Nano

A big part of ACDC is experimenting with new ideas and technologies, even when the outcome is uncertain. As part of that, we set up an Nvidia Jetson Nano that may—or may not—end up as part of our final implementation.

What makes this device interesting is that it is an edge AI computing unit with very low power consumption (around 15–25W), yet still capable of running a local LLM.

We experimented with running Ollama using Gemma 3, exploring how a small, embedded model could be used in off-grid or low-connectivity scenarios. One potential use case is a locally deployed language model that interprets environmental data from sensors—such as soil conditions, temperature, or image recognition—without relying on cloud services.
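As a rough sketch of what this could look like, the snippet below sends sensor readings to a locally running Ollama instance via its default REST endpoint (`http://localhost:11434/api/generate`). The sensor names, the prompt wording, and the `gemma3` model tag are illustrative assumptions, not part of our actual setup.

```python
import json
import urllib.request

# Default Ollama REST endpoint (assumes `ollama serve` is running locally).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_prompt(readings: dict) -> str:
    """Turn raw sensor readings into a prompt for the local model."""
    lines = "\n".join(f"- {name}: {value}" for name, value in readings.items())
    return (
        "You are an off-grid assistant. Interpret these sensor readings "
        "and suggest one action:\n" + lines
    )


def ask_local_model(readings: dict, model: str = "gemma3") -> str:
    """Query the local Ollama server and return the model's text response."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(readings),
        "stream": False,  # request one complete JSON response
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Hypothetical readings; requires a running Ollama server with gemma3 pulled.
    readings = {"soil_moisture": "18 %", "temperature": "31 °C"}
    print(ask_local_model(readings))
```

Because everything runs on the device itself, no internet connection is needed once the model weights have been pulled.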

These boards are also used in robotics applications and could potentially be the heart of our house-building robot in the real world.

The concrete use cases are still evolving, but the goal is clear:
to enable intelligent behavior at the edge, where power, connectivity, and infrastructure are limited.

Resolving these prompts takes anywhere from 15 minutes to half an hour, so we need patient customers.