Minisforum AR900i mobo (i9-13900HX) with GPU-assisted Global Shutter (GS) camera sensor for rapid-motion microscopy machine vision

Tangentially continuing the discussion from Minisforum BD770i Troubles [Solved]

I will apologize in advance for wasting anyone’s time with this speculative inquiry about the AR900i mobo … but the BD770i mobo discussion was just riveting … especially the “We no support that” transitioning to “Here link to firmware.”

Such an UPLIFTING story!

Deep in my heart, I do believe we shall overcomb one day.

2024 will be a year of more overcombing … so this is about a nanotoolworks machine vision science experiment to really get down to sciencey hobbyist things, like extending microscopy.

Are there any little AFFORDABLE-FOR-THE-MASSES microscopy projects for cellular- and molecular-level digital optical microscopy, in case families want to share videos of their gut bacteria or new virus friends before/after the holidays on Instagram?

Pondering my hellacious hardware HOBBYIST hallucinatory pipe dream prompts me to ask one of those annoying Is there a better way to do this than how I’m thinking about doing it? forum questions … somebody has to have done this [at the hobbyist level] … maybe not quite yet, but I’m not the only one thinking about this, right?

I am interested … because of @wendell’s BD770i video … but the Minisforum AR900i mobos I’m eyeing have not shipped yet … the reason for opting for the AR900i mobo was that cool, AFFORDABLE, soldered-in i9-13900HX and all of the fixins on the mobo … just for the VALUE of everything Minisforum packed into it, i.e., why can’t other mobo mfrs do it like that? Such a nice little starter package?

Maybe later some day, I can add an NVIDIA 4090 or 5090 or even 6090 GPU, or maybe something even better from AMD next gen … or maybe something else that becomes available in a couple years or when prices come down so that TOPS-per-$ AI value goes up even more … but I do applaud Minisforum for getting into mobos!

In a nutshell, I was thinking of the AR900i mobo as the starting point or basis for all kinds of different little GPU-assisted things to be added later … different kinds of things to play with … like life-sciences microscopy and image streaming, just like those gamer folks do.

The point of ONE particular little GPU-assisted microscopy project … would involve setting up the AR900i for a machine vision application … in which a GS camera sensor is mounted in a modified structured light epifluorescent microscope … the objective would be achieving something like super resolution computational saturated absorption microscopy or maybe GPU-assisted total internal reflection fluorescence (TIRF) microscopy.
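The core structured-illumination trick behind that kind of super-resolution scheme can be shown in a few lines: multiplying the sample by a sinusoidal illumination pattern mixes fine detail down to lower spatial frequencies (the Moiré effect) that the objective can actually pass, which is the part the GPU later has to unmix. A toy 1-D numpy sketch — all frequencies and cutoffs here are made-up numbers, not anything from a real instrument:

```python
import numpy as np

# 1-D toy model of the structured-illumination (SIM) trick:
# sample detail at f_detail lies outside the microscope passband,
# but multiplying by an illumination grating at f_illum shifts it
# down to |f_detail - f_illum|, which the optics CAN pass.

n = 1024
x = np.arange(n)
f_detail = 200 / n     # sample structure too fine for the optics
f_illum = 180 / n      # projected sinusoidal illumination
cutoff = 50            # passband: keep only frequency bins <= 50

sample = np.cos(2 * np.pi * f_detail * x)
illum = 1 + np.cos(2 * np.pi * f_illum * x)

def passband_energy(signal, cutoff):
    """Energy of the signal that survives a low-pass 'objective'."""
    spec = np.fft.rfft(signal)
    return float(np.sum(np.abs(spec[1:cutoff + 1]) ** 2))

direct = passband_energy(sample, cutoff)          # ~0: detail filtered out
moire = passband_energy(sample * illum, cutoff)   # large: mixed down to bin 20
print(direct < moire)
```

Real SIM reconstruction then takes several such modulated frames at different grating phases/angles and solves for the shifted frequency components — that per-frame FFT-and-unmix step is the part that benefits from a GPU.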

I suppose that I could do this a few different ways, such as:

Although there’s nothing particularly magic (see next paragraph) about the Arducam 2.3 MP GS sensor … I thought it might be a practical starting point … but I think that high-speed photography will be necessary in this GPU-assisted application … so I think that translates into using a Global Shutter (GS) camera module, which captures the light from every pixel in the scene at once to avoid the kinds of distortion that a Rolling Shutter (RS) module produces … so, my understanding is that it’s sorta [in a really hand-wavy sense] the same strategy used to minimize distortion when capturing the least-noise raw ultra-HDR images that current AI-phones use to render high-quality digital video.
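The GS-vs-RS distinction is easy to simulate. In this toy numpy sketch (function and variable names are all mine, purely for illustration), a global shutter samples every row of the scene at the same instant, while a rolling shutter reads rows sequentially — so a fast-moving vertical bar comes out straight under GS but sheared under RS, which is exactly the artifact that matters for rapid-motion microscopy:

```python
import numpy as np

def capture(frame_at, height=32, width=64, rolling=True):
    """Simulate a sensor readout.

    frame_at(t) returns the full scene (height x width) at time t.
    A global shutter samples every row at t=0; a rolling shutter
    reads rows sequentially, so each row sees a later instant.
    """
    img = np.zeros((height, width))
    for row in range(height):
        t = row if rolling else 0
        img[row] = frame_at(t)[row]
    return img

def moving_bar(t, height=32, width=64, speed=1):
    """A vertical bar that moves right by `speed` columns per time step."""
    scene = np.zeros((height, width))
    scene[:, (10 + speed * t) % width] = 1.0
    return scene

gs = capture(moving_bar, rolling=False)  # bar stays a straight vertical line
rs = capture(moving_bar, rolling=True)   # bar is sheared: each row caught it later

# Global shutter: every row sees the bar in the same column.
print(len({int(np.argmax(gs[r])) for r in range(32)}))  # 1 distinct column
# Rolling shutter: the bar's column drifts row by row (skew artifact).
print(len({int(np.argmax(rs[r])) for r in range(32)}))  # 32 distinct columns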

Other Global Shutter CMOS Sensor Options:

  • Sony: IMX296LQR-C (1.6MP), IMX412 (5MP), IMX540 (9MP), IMX671 (12.35MP), IMX686 (24.2MP)

  • ON Semiconductor: AR0330CS (3.4MP), AR0521CS (5.2MP), AR0820CS (8MP), X-IMX385 (12.3MP)

  • ams (formerly CMOSIS, not Canon): CMV2000 (2MP), CMV4000 (4MP), CMV12000 (12MP)

  • OmniVision: OS04A10 (4.1MP), OS08A10 (8MP), OS20A10 (20MP)

  • GalaxyCore: GC2145 (2MP), GC462D (12MP), GC5035 (5MP)

  • Aptina (now part of ON Semiconductor): AR0832CS (8MP), AR1920 (19.2MP)

  • Teledyne e2v: CMOS sensor range (Various resolutions)

  • Teledyne DALSA: Falcon Series (Various resolutions)

Possibly relevant Time of Flight (ToF) Sensors:

  • Timepix3 / Timepix4 (CERN Medipix collaboration; event-driven hybrid pixel detectors, not megapixel imagers)

  • Sony: DepthSense ToF sensors (sub-megapixel resolution)

  • STMicroelectronics: VL6180X (single-point ToF ranging, not an imaging sensor)

AGAIN … I apologize in advance … just in case anyone is offended by mixing life-science machine vision topics like structured light epifluorescent microscopes in with the more purely hardware-ish topic of motherboards that have not shipped yet.

I know, I know … we no support that! … but, that is what motivates the need to ask…


I suppose that I really do understand the persistence of interest in computer gaming or gaming rigs, i.e., gaming’s fine … a person can be interested in anything they want to be interested in … but, at some point, the funds for radically better gear to indulge the interest in the sport run out – why not work on something FUN that might generate some real money to pay for indulging in the sport of play?

A person could be monkeying around with things like different tricks to leverage generative adversarial networks to create realistic scanning transmission electron microscopy (STEM) images.

It’s NOT about having the STEM gear sitting on your bench at home … it’s about the SPORT of deploying complex reasoning and experimental techniques to solve a particularly difficult problem … it’s like a hybrid intelligence game, ie computers / AI can do parts of it, but other parts of this hybrid intelligence game involve the creative human playing with ideas.

Different games have different rules/constraints baked into the game … in this game, one rule is that simulated STEM images deviate non-trivially from REAL experimental images … it is extremely difficult to accurately reproduce complex experimental factors that impact the image, including detector noise, sample drift and scan distortions, time-dependent alignment errors, radiation damage, and surface contamination. This makes it close to impossible to train high-accuracy ML models with simulated data alone … it usually takes considerable manual optimization to generate a usable training set for a specific experimental data set.
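That “manual optimization” step usually amounts to forward-modeling a few of those corruptions onto the clean simulation until it resembles real micrographs. A minimal numpy sketch of that idea — the function name, the toy lattice, and every magnitude (dose, jitter, drift) are made-up knobs, not values from any real experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

def corrupt(sim, dose=200.0, jitter_px=2, drift_px=3):
    """Forward-model a few experimental artifacts onto a clean
    simulated STEM image (values in [0, 1]). All magnitudes are
    hypothetical knobs one would tune against real micrographs.
    """
    h, w = sim.shape
    out = sim.copy()
    # Sample drift: a slow whole-frame shift over the acquisition.
    out = np.roll(out, drift_px, axis=1)
    # Scan distortion: each scan line lands with horizontal jitter.
    for row in range(h):
        out[row] = np.roll(out[row], rng.integers(-jitter_px, jitter_px + 1))
    # Detector shot noise: counts follow a Poisson distribution.
    out = rng.poisson(out * dose) / dose
    return out

clean = np.zeros((64, 64))
clean[::8, ::8] = 1.0          # toy "atomic lattice"
noisy = corrupt(clean)         # drifted, jittered, shot-noisy version
```

In a GAN-style pipeline this hand-tuned forward model typically serves as the baseline (or the generator’s starting point) that the adversarial training then refines toward the real-image distribution.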

Philosophically, we can ask, “What’s the point of our tech anyway?” This is a REAL question to ponder … unless we’ve just become disposable gadgets.

This topic was automatically closed 273 days after the last reply. New replies are no longer allowed.