Return to

The Linux Classroom


Hi All! I am helping an organization launch coding classes in underserved communities. We are often working with very old hardware. In setting up our first class, we were quoted $80/machine for Windows 10 volume licensing. That simply wasn't going to happen. For one, our curriculum collaborator and lead designed everything for Mac users with a terminal. I don't like Apple products much, so I opted for a Linux classroom for our intro-to-Python and intro-to-coding classes.

I am hoping this thread will be a brain dump regarding deployment strategies for classes of ~20 clients and a local server. We are moving into Azure or AWS soon to help manage volunteers and whatnot, but I don't think a cloud-everything solution is the way to go yet. The building won't let us use their internet anyway. Think contained, local, inexpensive, and open-source. Think everything, including resources and notes, on a local server; clients that need to be ready to play; and me plus a few staff and volunteers running around random buildings, basement classrooms, and miscellaneous offices or libraries to service a remote, isolated set of computers.

Problem 1: Imaging full clients vs. thin clients; users are given sudo on their machines - fresh start needed between classes
##Current Thinking.1: PXE, but this can't be scripted on the client, I don't think. We'd still need to be there to boot. I'd like to be able to tell the local server to initiate the process and have all clients reinstall their OS on boot. Please include how much you think it'd cost; we can't go over ~$3000 for networking and the local server itself.
##Current Practice.1: USB installation every class.

Problem 2: Backup and Storage of Student Work
##Current Thinking.2: Local share drive and a git-repo VM (important practice), with folders dumped to cloud storage between classes. I actually like this and think it will give students tools to build code and other content during their coursework, with everything saved in multiple locations. ZFS on Linux runs on the base OS, and Git as well as other class services run in VMs; volunteers or I prepare a clone and dump it onto a bare-metal drive (current and future server machines will have a ZFS pool, a bare-metal VM HDD, and a boot drive).
##Current Practice.2: ZFS raidz1 for the 12-week course. Manual upload between classes (no automation service chosen yet - HELP PLEASE).

Problem 3: Testing, Attendance, Project Turn-In
##Current Thinking.3: It would be amazing to implement a local HackerRank-style submission system. The IO capture doesn't seem impossible, but our volunteers and I would need to build the IO tests for each assignment. Deployment through a course manager like Claroline would be amazing as well, especially if we can figure out how to push from our cloud account to local servers when they are online (some will seriously need a staffer to go in and turn their phone into a hotspot - chained-down computers, locked boxes, barred windows, no joke).
##Current Practice.3: The local server has a LimeSurvey VM installed. Assignment templates and quick submissions for attendance are available on our cloud account and imported to the local VM (or the whole VDI is uploaded). This actually solves a lot of quality control issues: it gives students a basic upload survey (they can upload files for code submission) and attendance (a survey for each day with a random simple puzzle or code-snippet submission required), and it logs client IPs, custom user IDs, dates, etc. automatically. I haven't figured out if there is a CLI for LimeSurvey - BASH automation of export and upload to the scoring server would be nice, but we may decide on a different content delivery system for future classes.
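On the local HackerRank idea: the IO capture really can be small. Here's a sketch under assumptions I'm making up (Python submissions, and one `.in`/`.out` file pair per test case - that naming is my own convention, not a standard):

```shell
#!/bin/sh
# grade.sh - minimal IO-diff grader (sketch)
grade() {
  submission=$1                 # path to the student's Python file
  tests=$2                      # directory of paired test files: 1.in/1.out, 2.in/2.out, ...
  pass=0; total=0
  for input in "$tests"/*.in; do
    total=$((total+1))
    expected=${input%.in}.out
    # run the submission with captured stdin, diff its stdout against the expected file
    if python3 "$submission" < "$input" | diff -q - "$expected" >/dev/null; then
      pass=$((pass+1))
    fi
  done
  echo "$pass/$total"
}
```

Volunteers would still have to write the `.in`/`.out` pairs per assignment, but scoring, logging, and a leaderboard could all hang off that one function.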

I'll leave it at that! Three main areas. As I go, I'm writing up the setup (step-by-step, including CLI commands and considerations). The goal is a best-practice for rapid deployment of 10-20 clients and 1 server with consumer to prosumer or used server hardware. Racks are probably not an option due to space, so everything server-related needs to fit in, at largest, an E-ATX case. We (I) love building PCs, so odd configurations and case mods are acceptable.

Also, the server needs to serve a mixed bag. We may be bringing a server for class purposes into a Windows or macOS lab that's already built. I would still like to give students the server to play with rather than fuss with a scalable cloud solution at this time (especially since grants might cover a server we can haul around, but not irregular cloud costs).

Thank you. Thank you. Thank you. Yes. Thank you.


Also, is there any way we can get materials? Video tutorials, textbooks, examples - anything that can be relocated to the local server. Again, potentially no internet and 20 prying eyes…


Hey, this is a handful and I don't think an online forum is quite enough to sort out something this complex. I'd suggest getting professional help if you need it.

But, to deliver some input on your second problem: if you're running Git for the students anyway, you could set up a scheduled task that uses a distinctive commit message (to make clear the commit was done by the system) and have a script run a commit and push when classes finish. That way everybody has their stuff backed up in a (at least to my liking) neat and easy way.
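Something along these lines, assuming the repos all live under one base directory (the `/srv/classwork` layout, the `[auto]` tag, and the bot identity are just examples):

```shell
#!/bin/sh
# auto_commit.sh - end-of-class snapshot of every student repo (sketch)
auto_commit() {
  base=${1:-/srv/classwork}            # example layout: one repo per student under here
  stamp=$(date +%Y-%m-%d)
  for repo in "$base"/*; do
    [ -d "$repo/.git" ] || continue
    git -C "$repo" add -A
    # "[auto]" marks commits made by the system, not a student
    git -C "$repo" -c user.name=class-bot -c user.email=bot@classroom.local \
      commit -q -m "[auto] end-of-class snapshot $stamp" || true   # "nothing to commit" is fine
    # push only if the repo has a remote configured
    git -C "$repo" remote | grep -q . && git -C "$repo" push -q origin HEAD || true
  done
}
# e.g. call auto_commit /srv/classwork from a cron job at the end of each class block
```

The `[auto]` prefix also makes it trivial to filter system commits out of a student's own history later.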


You could use Ansible or cloud-init for automated setup of a large number of machines. Alternatively, you could install Linux on one of the computers in as small a partition as possible, and just re-image all of the drives and expand the partition every time you need a fresh start.


It's fun. It really is. We have some wonderful web architects and software engineers who will be with the students. Our collaborators are really pushing to challenge the students with difficult but manageable tasks. To be frank, though, we have looked at enterprise options. We've spoken with IT professionals too, and they immediately jump to Cisco managed equipment or similar. For high schools, we have that option at nearly $2,500/student. It simply doesn't work for low-income deployment.

I really like the idea of a custom commit! If each little server tags itself, we'd have a neat set of submissions to compare/contrast and a wonderful cheat-detection tool for the students! Further, faster.


OMG. package_update: true. It can't be that easy. Thank you, thank you for letting me know about cloud-init. I had been pitched Ansible, but I'd rather build two VR gaming systems a year and let the students make games than pay for effectively clean Ubuntu or Linux Mint installs. Also, we won't be imaging all of the machines. Day 1 in all the curriculum is installing nano, Sublime Text, and Python - why rob them of that glory? Plus, some of our donor labs will be locked into a Windows volume license, and then we'd have 20 computers we're paying for but not cleaning between classes.

What I LOVE about cloud-init is that we can use it to deploy the servers themselves from our cloud account. It'll make building boxes and virtual classrooms for testing monumentally faster. Thank you, thank you.
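Since cloud-init came up: here's a minimal sketch of what a per-client user-data file could look like for a lab like this. The account name, password, and choices here are placeholders of mine, not anything from this thread:

```yaml
#cloud-config
package_update: true              # refresh apt indexes on first boot
users:
  - name: student                 # placeholder account name
    groups: sudo                  # students get sudo; machines get wiped between classes
    shell: /bin/bash
    lock_passwd: false
    plain_text_passwd: changeme   # tolerable on an offline lab LAN; rotate per cohort
packages: []                      # intentionally empty - day 1 is the students installing tools
```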


Have you ever used Foreman (Foreman Discovery)? Bare-metal installations don't get the user inputs a virtual instance has. It seems to bridge that divide a bit.


I haven’t heard of that before but it sounds a little bit like MaaS.

  1. Just use live CDs, and the students can store their stuff on provided USBs. You can teach them how to mount them from the CLI too if you want.
  2. Local git server where students can clone their repos to/from. This could just be a Raspberry Pi with a large SD card. Just be sure to have good backups.
  3. Buy a used Dell PowerEdge server for $200 and slap a half-decent SSD in it. No need to fool with multiple hard drives. If the server will just be used to store git repos, it will take a long-ass time to actually fill it up with source code. My old university had a Linux Mint VM running as an FTP server for students' source code. We didn't use git, but we could SSH and SFTP remotely. It was only 32 GB in size, and after 6 years of running it had only used up around half of its capacity.
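For point 2, the per-student setup on that Pi could be a one-liner wrapped in a function (the `/srv/git` path and clone URL shape below are just examples):

```shell
#!/bin/sh
# new_student_repo.sh - one bare repo per student on the class git box (sketch)
new_student_repo() {
  base=${1:-/srv/git}               # example path on the Pi's storage
  student=${2:?student name}
  # a bare repo has no working tree; it exists only to be cloned/pushed to
  git init -q --bare "$base/$student.git"
  # students then clone with something like: git clone pi@server:/srv/git/<name>.git
}
```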

If you do this on the cheap you could probably get away with spending only $1k on supplies, so long as you stick to used stuff for the bulk of it; the only real costs will be the USBs and the one decent SSD for the server.

Hell, if you pay for shipping I’ll just send you mine. I wouldn’t mind contributing to help students with their education.

As far as materials go, I've got like 18 GB of PDFs I could send you too. But you should be able to get everything you need from here.


Interesting. I haven't tried installing packages when using a live CD. I'll make a note to experiment with that. The students are responsible for building their own IDE. Can you add packages to the boot ISO? We could preload it with a lot of options (especially if there's no internet).

The PowerEdge won't fit. We need to slide the server under a small table. Worse, it will probably be on the floor. There is no rack, nor the ability to run cabling to a closet, at any of the locations we're looking at.

The server does a little more than store. I only want to deploy one server, and it will also hold content created by the students (little videos, chats, diaries, screenshots). We are also going to open up a production VM for them to play with for video/audio encoding (they will make testimonials and small presentations for other groups and need to encode them for upload). We don't have a cloud solution for student-accessible computational work like that just yet.

I thought of USBs too. I really thought about it. Unfortunately, we met and decided the class needs to be self-contained, and we don't want to monitor a checkout system for the flash drives.

Please send the PDFs!! I'll grab those ebooks too! Great idea. Thanks for your input. What's the safest way to get you the information? I can create an FTP account for you or share a Google Drive folder!

As for the server!! Eventually, we want to containerize all local server activity and just launch a bunch of Docker images for services (attendance/testing, the production VM, PXE - sorry, I still think PXE is going to work better long-term - and user management). The PowerEdge could definitely do all of that in spades… we just don't have a place for it yet.


I can hardly believe something like MAAS is free. Whoop whoop. Thanks for the lead. I’ll take a look. I think I’ll put this on the test list for sure.


I recently stumbled upon PiNet and this seems perfect for your application. However, I have never used it, so you're on your own…


Thanks, @reavessm <- wow, that works! (I don't Twitch.) I'll take a look at the documentation this morning.


Looks like you can run tftpd and DHCP off the Raspberry Pi without too much trouble. A pair of them is only $60. That's a lot cheaper than the boxes I was looking at.

Now I'm looking at USB Ethernet adapters. That's kind of exciting. I'm not too shabby at CAD. I bet I could make a cool little enclosure for everything… with a USB fan, even.
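For anyone following along, one way to get TFTP plus DHCP off a Pi is a dnsmasq config along these lines (running in proxy mode, so an existing router keeps handing out addresses). The subnet and file names are assumptions on my part, not a tested setup:

```
# /etc/dnsmasq.d/pxe.conf - netboot sketch for a Raspberry Pi
dhcp-range=192.168.10.0,proxy        # proxy DHCP: don't assign IPs, only add PXE info
enable-tftp
tftp-root=/srv/tftp                  # drop pxelinux.0 (BIOS) or a UEFI loader here
dhcp-boot=pxelinux.0
pxe-service=x86PC,"Install Linux",pxelinux
```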


Let us know how things go after you set it up! I haven't seen much news about this, so I would like to hear some real-world experiences.


Will do. Our first class launches this month - in a few days, actually. Once the class is running, the students are programming, and server v1 isn't failing, I'll post an update.


So. Learned something.

Do not check a server on a plane (courier service? how do you ship these things?)… even if it's wrapped really well. Lost video out, and it doesn't have a speaker. Whoot. Off to Fry's this morning to debug.


Something to think about - Repositories. They are amazing. It is fantastic how easy apt-get and aptitude make things. However, absent the internet, and without a clear installation path for most if not all packages, it has come to my attention that there is a MASSIVE deficit in offline installation.

Looks like we'll have to create a repository mirror prior to deployment. Trying to single-install .debs (and make from source where necessary) is just awful. Three people nearly tore their hair out trying to install Samba on Ubuntu, Kubuntu, and Debian machines. Yes, there's a -f. No, we shouldn't have to use it. Definitely one of those things that made us realize why Windows is winning: though bulky, Windows installers often come with all necessary libraries. Though lean, Linux packages require access to a repository, or you simply can't use them - or you have to install them broken.

I’d like to see ‘bulky’ .deb files.

Oh, and AT&T just might shut your internet down if you try to mirror the Ubuntu archive via FTP. It thinks you're a bot, and they freak out. Fun stuff. Kubuntu is really pretty and very fast on these old machines, though. We're getting there.


Another well-documented problem is the lack of driver support for USB WiFi devices. We even want to do a Raspberry Pi net in the class. It's looking like we'll have to be very careful about which dongles we purchase. The jurobystricky driver does not work as installed on Kubuntu 18.04 or Ubuntu 16.04.1. Sadness. Another return.


I mean… you’re criticising Linux for having lots of programs?

I get the issue, though - 20k packages isn't easy to manage. For offline installs you have the DVDs, which, if you like, you can shove on a USB drive and point your repository config file to.

The good thing is Linux has a way of doing offline installs from trusted sources.

Once you have the DVDs/archives, you can put them anywhere: an external drive, a computer, or a central server for a repo.

For a single standalone machine, you can put it on a drive that you can take away and update whenever.

For a network, make one machine a central repo for the network. It's not hard, and once done it's easy to keep up to date.
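To make that concrete, the apt side can be as small as one sources entry per machine. The mount point, IP, and release name below are examples only:

```
# /etc/apt/sources.list.d/offline.list (example paths and names)
# (a) an install DVD/ISO loop-mounted locally:
deb [trusted=yes] file:/media/dvd1 bookworm main
# (b) the same content copied onto one machine and served over the LAN:
# deb http://192.168.10.2/debian bookworm main
#
# [trusted=yes] skips signature checks; only add it if apt can't verify
# your loose copy, and only for media you trust.
```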

This is easier than Windows, which can be a bit more painful for offline updates on networked machines, for example.

Plus, since the packages are still from your distro, they are still signed and trusted.

(The Debian 64-bit repo, for example, is only 300 GB.)

There's a tonne of USB WiFi adapters for RPis - they even brand themselves as being for the RPi - so you shouldn't have issues getting them for those.


I'm glad Linux has so many programs. That's not it at all. I would argue that there is a need for user-friendly offline installation packages that don't require mirroring portions of a repository. 300 GB is a LOT of data at 600 kb/s! It was a lesson in Linux deployment: we need to mirror repositories onto the class server prior to deployment, even for some basic services and installs. Installing directly from individual packages is a no-go.

What we've decided to do is create a VM with an Apache server already set up, mirroring the Ubuntu repositories. We'll just drop the VDI onto the next server deployed and configure it with the new IP. Then we can add the repo to the client machines for installs. Yes, it's AN answer; I don't think it's a good one. We're already asking around to see if we can create package installers that simply come with all their dependencies. If we get that process down, we'll script install-makers for major packages (Samba!!! cifs-utils, holy crap. Python and a few others) for instances where we need to deploy a Linux environment away from a central server or the internet (free laptops in a ghetto basement, for example).
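For what it's worth, the client side of that mirror VM can be scripted in a couple of lines. The mirror IP, suite, and file name below are placeholders of mine:

```shell
#!/bin/sh
# use_local_repo.sh - point apt at the class mirror instead of the internet (sketch)
use_local_repo() {
  mirror=${1:?mirror host/IP}                           # e.g. the mirror VM's LAN address
  suite=${2:-bionic}                                    # Ubuntu release the clients run
  dest=${3:-/etc/apt/sources.list.d/class-mirror.list}  # drop-in sources file
  printf 'deb http://%s/ubuntu %s main universe\n' "$mirror" "$suite" > "$dest"
}

# usage (as root): use_local_repo 192.168.10.2 bionic && apt-get update
```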

Again, nothing against Linux. It’s just a feature we want. So, we’ll see about building it.

How we got around it this time:

Phone Hotspot -> Wifi on Windows Laptop -> Bridge to Ethernet -> WAN port on router -> class computers

We will have internet at the facility we're in now, but we're waiting for a PR to go through for a cable run and for AT&T to start service.