Inconsistent behavior of old DOS SCP client between different Linux OpenSSH servers

Hi all, I’ve got something of a headscratcher here. I’m trying to get an old DOS machine (yes, really) to download files via SCP from an Ubuntu 20.04 server. The basic idea is that the DOS machine needs to connect via SCP and download all the files that are in a certain directory.

The client does connect, but it only downloads one file before disconnecting. The crazy thing is that this works fine when the server is a RHEL 7 server with OpenSSH 7.4, but I’m trying to migrate this over to an Ubuntu 20.04 server with OpenSSH 8.2. Without touching any of the client’s settings, I can use a domain alias to choose which server the client is connecting to. So in other words, the behavior is different with the same client settings. Because of that I’m pretty convinced there’s a problem on the server side. I just can’t figure out what that problem is.

The sshd configuration on both servers is practically identical, with the exception of re-enabling some older algos on the newer server. Setting LogLevel to debug3 didn’t yield anything useful.
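For reference, re-enabling legacy algorithms on the OpenSSH 8.2 server looked roughly like this (a sketch from memory; the exact algorithm names depend on what the DOS client offers during key exchange):

```
# /etc/ssh/sshd_config -- hypothetical legacy-compat fragment
KexAlgorithms +diffie-hellman-group1-sha1
HostKeyAlgorithms +ssh-rsa
PubkeyAcceptedKeyTypes +ssh-rsa
Ciphers +aes128-cbc,3des-cbc
```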

I compiled OpenSSH 7.4 and got that running on the new server, but the behavior I observed was still the same. I thought maybe the security-related changes to SCP in newer versions might’ve been the deciding factor, but it doesn’t seem that way now.

I’m really at a loss as to what the cause could be. Any ideas?

Is the DOS scp client saving a host key somewhere? If you’re redirecting it with DNS, I’d have expected it to get ‘upset’ about the host key changing. Since you didn’t mention that, I imagine it’s either too basic a program to care, or you already resolved it. If not, perhaps that’s involved somehow.

Wholly speculation, but since you asked for ideas: how are you requesting ‘all files’ from the client end, and does the SSH user actually have r-x permissions on the directory in question on the new server?

Can this be replicated with the same scp program/batch file from, say, DOSBox?

I’d be tempted to set up a cron job on the server to tar the directory every 5 minutes and clean up any tar files older than the two most recent ones. (I do not, of course, have a suggestion for what you’d untar them with; perhaps FreeDOS has something.)
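
Roughly something like this (paths and the retention count are placeholders, obviously, not anything specific to your setup):

```shell
#!/bin/sh
# Rough sketch: snapshot the upload directory into a timestamped tar
# and keep only the two newest archives. Directory names are placeholders.
archive_programs() {
    src="$1"   # directory the technicians upload into
    out="$2"   # where the tar snapshots go
    mkdir -p "$out"
    stamp=$(date +%Y%m%d%H%M%S)
    tar -cf "$out/programs-$stamp.tar" -C "$src" .
    # drop everything except the two most recent snapshots (the n-2 cleanup)
    ls -1t "$out"/programs-*.tar | tail -n +3 | xargs -r rm -f
}

# crontab entry, every 5 minutes, assuming the above lives in a script:
# */5 * * * * /usr/local/bin/archive_programs.sh /srv/cnc/programs /srv/cnc/archives
```

Then the DOS side only ever has to fetch one known filename instead of a wildcard.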

This is fascinating…

May I ask more about the use case and about the TCP/IP stack in use?

The DOS client is too primitive to care about such things, so that luckily hasn’t been an issue.

It uses a wildcard, so it’s something like username@host:/path/on/server/*. That’s the gist of it, though I don’t remember the exact command right now, but I can dig it up tomorrow.
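
I suppose one way to narrow it down would be to run the same wildcard fetch from a Linux client against both servers and diff the verbose output (hostnames and path here are placeholders, not the real ones):

```
# compare what each server's scp sends back for the same wildcard
scp -v 'user@old-rhel7-server:/path/on/server/*' /tmp/from-rhel7/
scp -v 'user@new-ubuntu-server:/path/on/server/*' /tmp/from-ubuntu/
```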

The directory path, ownership, and permissions are identical between the two servers, so the SSH user should have the same access.

That’s an interesting idea. I haven’t tried it, but I’ll definitely see if I can get that going. Thanks for the tip!

I can’t be too specific, but basically the place where I work has an antique CNC machine that can’t be replaced because it’d be too expensive (believe me, I asked :smiley:). You know how in Warhammer 40K there are pieces of forgotten technology from millennia ago, and no one who is still alive knows how any of it really works, but there are techpriests who know the right rituals and incantations needed for their normal use? It’s like that.

The control computer inside the CNC machine runs DOS, and the workflow is that a technician will use their workstation to send their CNC program files to the server, then go to the CNC machine and pull the files down from the server by executing a batch file before starting the CNC job itself with a further series of batch files.

I honestly couldn’t really tell you anything useful about the TCP/IP stack, at least not off the top of my head, other than that the batch file begins by loading some sort of TCP/IP drivers into memory, and then unloads them at the end. I’ll see if I can dig up anything more specific.

Lol you won the internet today, and made my day with your most excellent analogy, good sir.

Is the US government waking up its bunkers and missile silos that can’t functionally be “hacked”?