Jetson Nano Super - Plus Giveaway

Video

Background

We've been experimenting with the Jetson Nano for RAG, LLMs and… optical recognition? Yes, we built a dice-roller robot thingie for Steve. It can run on the Jetson! And the new Jetson Nano Super is unlocked to 25W, with more memory bandwidth.

NVIDIA just launched the Jetson Nano Super with some big changes – the biggest being aggressive pricing.

NVIDIA has also committed to keeping these modules in supply for 10 years, because they can be incorporated into industrial and at-the-edge designs.

Future Projects

What do you want to see? I am thinking a home assistant with conversational capabilities that can execute Node-RED actions. An agent that can direct other agents (such as the agent running Coral AI modules on the Frigate DVR system). Retrieval-augmented generation based on AI-generated transcripts of security footage? Now accomplishable in a Saturday afternoon.
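The retrieval half of that last idea is only a few lines once you have embedding vectors for the transcript chunks (from whatever embedding model you run on the Jetson) – a minimal sketch, with function names that are just my own:

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_match(query_vec, doc_vecs):
    """Index of the transcript chunk most similar to the query."""
    return max(range(len(doc_vecs)), key=lambda i: cosine(query_vec, doc_vecs[i]))
```

Feed the best-matching transcript chunks into the LLM prompt and that's the whole RAG loop in miniature.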

More Info

The Giveaway

Click the link above and check it out – reply below and answer –
What would you do with a Jetson Nano Super??

Replies will be selected at random to win the Jetson Nano Super.

Thanks Jensen!

Heads up, we’re picking a winner at the end of the week – this Friday 12/27!!

52 Likes

This is very nice. I’m excited to see the Ampere cores. I’d like to mess around with it to see what kind of small LLMs I could get running locally, and how many tokens per second I could get out of it.
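For the tokens-per-second test: if you serve the model with Ollama, the non-streaming `/api/generate` response includes `eval_count` (tokens generated) and `eval_duration` (nanoseconds spent generating), so a rough benchmark could look like this – assuming a local Ollama install at its default port:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Convert Ollama's generation stats into a tokens/sec figure."""
    return eval_count / (eval_duration_ns / 1e9)

def benchmark(model: str, prompt: str) -> float:
    """One non-streaming generation; returns generation speed in tokens/sec."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        stats = json.load(resp)
    # eval_count = generated tokens, eval_duration = generation time in ns
    return tokens_per_second(stats["eval_count"], stats["eval_duration"])
```

Run `benchmark("llama3", "some prompt")` a few times and average, since the first run includes model load time.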

1 Like

That’s so nice, thanks!

I’ve been eyeing one for quite a while, but never had the courage (aka spare money) to justify buying one.

What I want to do with it long term is use it as a “brain in a jar” for my home: plug it into Home Assistant and use it to help me stay on top of life with some “thoughtful” automation.

Short term I want to experiment with audio and video software to transcribe and differentiate between multiple people talking in real time, and find some way to analyze guitar tones off tracks and try to replicate the signal chain.

On the video side it’s just basic computer vision: I’d like to put together a system that scans my groceries, recognizes the expiration date (or sets one on fresh foods based on given parameters) and keeps track of those details somewhere.
Or I’d build a personal “street view” device to incorporate into my sub/wardriving setup: basically snapping a 360° shot matched with a captured signal and figuring out on board where it might come from – for example, associating a gate with the captured signal.
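The bookkeeping half of the grocery idea is straightforward once OCR has produced text. A sketch of pulling an expiration date out of it – the two label formats here are just the common ones I’d expect, and the DD/MM ordering is an assumption you’d adjust per region:

```python
import re
from datetime import date

# Assumed label formats: "EXP 2025-03-14" (ISO) and "BEST BEFORE 14/03/2025" (DD/MM/YYYY)
PATTERNS = [
    (re.compile(r"(\d{4})-(\d{2})-(\d{2})"),
     lambda m: date(int(m[1]), int(m[2]), int(m[3]))),
    (re.compile(r"(\d{2})/(\d{2})/(\d{4})"),
     lambda m: date(int(m[3]), int(m[2]), int(m[1]))),  # DD/MM assumed
]

def parse_expiry(ocr_text: str):
    """Return the first expiration date found in OCR output, or None."""
    for pattern, build in PATTERNS:
        m = pattern.search(ocr_text)
        if m:
            return build(m)
    return None
```

Anything `parse_expiry` returns gets logged with the product; fresh foods without a printed date fall through to your own default shelf-life rules.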

There is so much I’d like to try I’m sure it would only be a matter of time before even more ideas pop into my head.

2 Likes

I have the non-Super Orin Nano (along with the original Jetson Nano 4GB) running some small models/LLMs.

I currently have some models running on my desktop just for inference, such as Whisper for transcription and a multi-task learning CNN model related to my master’s, that I’d love to off-load to a smaller device with lower power consumption. The Orin Super would be perfect for that (especially given how it shares the same GPU arch as my desktop).

It’d also be really nice to see if the “Super” changes are indeed just related to clocks, and if pushing a non-Super model to those higher clocks could close the gap between them.

Ideally I’d also shove those into a local K8s cluster, but getting MPS to run on those Tegra devices is not that straightforward.

2 Likes

I would use it to start dipping my toes into AI by analyzing sounds and images.

Honestly, I know this is a little baby board but I think it’ll be an interesting addition to see how it improves my setup for distributed inference with Exo (GitHub - exo-explore/exo: Run your own AI cluster at home with everyday devices 📱💻 🖥️⌚).

3 Likes

Facebook honestly doesn’t get enough credit for opening up their Llama 3 model, and it looks pretty stellar for what it is – a LOCAL model.
That would be my first experiment with it.

2 Likes

At the risk of being extremely boring, my only current plan would be to put it in my tailscale net and use it primarily to run LLMs so I don’t have to close GPU-accelerated applications like CAD (and access it remotely). But I’ll totally steal other people’s ideas to supplement this :b

I’d like to make a smart bird feeder that uses image recognition to identify birds. I’d also like to hook Ollama up to Home Assistant.

2 Likes

Future CS student and current programmer here:
Would love to use this as a learning platform.

I would be interested if I could use it with a spectrometer to make a super cheap optical coherence tomography system for detecting retinal disease. It could run the segmentation algorithms onboard for near real-time AI-driven assessments of retinal layer thicknesses.

1 Like

Would use it as a starting point into machine learning.

I would love to use this to run an automatic photobooth at our local festival. Teach it to recognize when people step in, recognize things it shouldn’t post, and post the rest to our social media feed automatically. It’s the weeding process for undesirable content that would need the most processing/fine-tuning.

1 Like

Wendell, Thank you for the opportunity. I would probably use it to upgrade my Home Assistant/Security system for Vehicle/Object/People recognition.

I have been trying to buy a few cheap microcontrollers (Arduino clones) to learn how to program them, with the hope that I will be able to build a big biped robot that can be controlled by a human operator.
At the moment I am practicing repairing some old GPUs and saving money here and there to be able to buy tools and development boards. I was also hoping to buy a Raspberry Pi 5 to use as the brain for the robot, so it can do stereo vision to get a feel for depth and determine how far an object is from it.
But with this Jetson I could do so much more: object recognition and object following, and various AI workloads to make a controller input move an arm or make it walk forward based on an ML walking system, and so on.

Good luck to everyone!

I would plug it into a power monitor, run the smaller Llama 3.3 model and test out the power consumption. Afterwards I would say “Thanks Steve”.
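If the power monitor logs timestamped wattage samples, average draw and total energy fall out of simple trapezoidal integration – a sketch, with a `(timestamp_seconds, watts)` sample format I’m assuming:

```python
def energy_joules(samples):
    """Total energy over a run of (timestamp_s, watts) samples, by trapezoid rule."""
    if len(samples) < 2:
        raise ValueError("need at least two samples")
    joules = 0.0
    for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
        joules += (w0 + w1) / 2 * (t1 - t0)  # trapezoidal area for this interval
    return joules

def average_watts(samples):
    """Mean power draw over the whole sampled run."""
    return energy_joules(samples) / (samples[-1][0] - samples[0][0])
```

Log one run per model/quantization and you get joules-per-benchmark, which is a more interesting number than peak watts alone.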

I’d like to try out some LLMs and see about integrating it into Home Assistant.

I would use something like this to prototype some ideas I have for telescope control and data collection. Having access to some onsite ML for smaller remote telescopes could be really helpful for improving data processing, calibration, and tracking.

Evil (home) assistant… everybody talks about Jarvis, but I really prefer an Igor assistant like in Brainscan – pretty dope for 1994.

Assistant aside, I would really like to play with the image recognition, maybe to see if I could detect the neighbor’s cat when he comes pooping on my door rug…
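One cheap trick for the cat-on-the-rug detector: gate the expensive classifier behind simple frame differencing, so the Jetson only runs inference when something actually moves. A sketch, assuming frames arrive as flattened grayscale pixel lists and with thresholds you’d tune for your camera:

```python
def motion_fraction(prev_frame, frame, threshold=25):
    """Fraction of pixels whose grayscale value changed by more than threshold."""
    changed = sum(1 for a, b in zip(prev_frame, frame) if abs(a - b) > threshold)
    return changed / len(frame)

def should_run_detector(prev_frame, frame, min_fraction=0.02):
    """Only wake the (expensive) cat classifier when enough of the frame moved."""
    return motion_fraction(prev_frame, frame) >= min_fraction
```

That keeps the GPU idle (and power draw down) during the 99% of the day when the rug is cat-free.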

Love the channel :wink: peace

What would you do with a Jetson Nano Super??

I’m a CS student and my next semester is super focused on computer vision, so I would love to use something like this for my next projects.

1 Like