Server for a publishing house?

Hi everyone,

I like to think I'm quite good with computers and linux for the stuff I do, yet yesterday I got a task from my boss at a publishing house. This is my problem:

He employs about 80 people. All of them, over the last 13 years, have amassed only about 2GB of data for work. That's extremely little. However, he keeps getting people telling him two days before a deadline that they've lost 4 months' worth of work. He's pretty upset about that. I told him about my Linux setup (every 10 minutes rclone uploads all my work to GDrive and downloads new stuff back again) and he was thrilled by it. At the end of our talk, he simply told me: "Get me a list and a budget and set it up."
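For context, the crontab behind my own setup looks roughly like this ("gdrive" is just the remote name I happened to pick in `rclone config`, and the paths are mine):

```
# m h dom mon dow  command
# every 10 minutes: push local changes up, then pull anything new down
*/10 * * * * rclone copy /home/me/work gdrive:work
*/10 * * * * rclone copy gdrive:work /home/me/work
```

`rclone copy` skips files that are already identical on the other side, so running it this often is cheap for small amounts of data.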

Here is the real data: 99% of the people work from home, so I need them to access the server over the internet. I need it to be (at least on the client side) as idiot-proof and low-maintenance as possible, since most people working there are writers, some of whom are in their 50s and have never touched a computer beyond opening Word. A simple network-drive solution would be optimal, so that they can transparently upload the data to the server. And I'd love to do this with a Linux machine so that I have a familiar SSH tunnel and set of commands, ideally Debian, but I'm not limited to that.

Has anyone done this sort of setup before? I would love to run some tests before I hand it in. What protocol can I use to set it up as a simple network folder? Is something like this possible over the internet? I'm not sure Samba would be able to do this. I'm also looking at individual access to files, but that's not such a big issue as of yet; right now I'm looking at the connection to the system itself. The workload really is minimal, it's just about uploading and downloading a couple of MB a day at most.

What would you use for such a workload? I've found a small HP server, some 1TB disks (he might want to upload some more stuff on there, who knows) and 2 flash drives. I would like to set it up so that the disks are always up to date and receiving data, but e.g. every Monday and Thursday the machine mounts one of the flash drives, copies everything onto the clean flash drive and unmounts it. That way I have something for backups in case of physical damage to the HDDs (those flash drives are solid aluminium and stick out only about 5mm from the port). The last thing my boss was thrilled by is that my PC is set to e-mail me when something goes wrong. He'd love to receive weekly reports on all the non-syncers and be able to keep tabs on their work.
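To make the mount/copy/unmount idea concrete, here's a rough sketch of the script I have in mind. The device name, mount point and data directory are my own placeholders, not anything from a real setup:

```shell
#!/bin/sh
# Twice-weekly backup of the server's data directory to a flash drive.
# DATA_DIR, USB_DEV and MOUNT_POINT are hypothetical defaults.

DATA_DIR="${DATA_DIR:-/srv/sftp}"
USB_DEV="${USB_DEV:-/dev/sdb1}"
MOUNT_POINT="${MOUNT_POINT:-/mnt/usb-backup}"

# Copy everything from $1 into $2, preserving permissions and timestamps.
mirror_dir() {
    mkdir -p "$2"
    cp -a "$1/." "$2/"
}

run_backup() {
    mkdir -p "$MOUNT_POINT"
    mount "$USB_DEV" "$MOUNT_POINT" || exit 1
    # Date-stamped folder per run, so old copies survive until the drive fills.
    mirror_dir "$DATA_DIR" "$MOUNT_POINT/$(date +%Y-%m-%d)"
    umount "$MOUNT_POINT"
}

# Only run the mount/copy/umount cycle when called with "run", so the
# functions can be sourced or tested without touching a real device.
if [ "${1:-}" = "run" ]; then
    run_backup
fi
```

A crontab line like `0 3 * * 1,4 /usr/local/bin/usb-backup.sh run` would then cover the Monday-and-Thursday schedule.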

Is there some special kind of software I don't know about? Otherwise I'll just create accounts, set up an SFTP server, send out e-mails with instructions for setting it up on the client side, and give every person 50MB of storage. Done deal. Twice a week a cron job copies the contents to USB and everything's dandy. Is there a better solution?
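In case I do go the SFTP route: OpenSSH can already lock each account into its own folder with a chrooted internal-sftp setup. A sketch of the sshd_config stanza (the group name and paths are placeholders of mine):

```
# /etc/ssh/sshd_config
Subsystem sftp internal-sftp

Match Group writers
    ChrootDirectory /srv/sftp/%u
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```

One catch: the chroot directory itself must be owned by root and not writable by the user, so each writer would get a writable subfolder (e.g. /srv/sftp/writer01/work) inside it.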

I feel like we're missing something here. They're 4 months' worth of work behind? Just behind in uploading to the existing server?

The whole post makes it sound like the problem is people not getting the work to the server in a timely fashion. If that's the case, then what you want to look into is what's keeping them from doing this. Is there something about the existing setup that takes them well out of the way of their day-to-day workflow? Is there just no server at all to begin with, and all of this data is just spread across the employees' desktops/laptops?

Otherwise this seems like a pretty cut-and-dried VPN + FreeNAS setup.

This.

People are just blaming technology for not doing work.
Even if you eliminate roadblocks, they'll just shift the blame (computer won't boot, power failure, kid/dog broke the computer).

With 80 people, doesn't this guy already have some sort of infrastructure? 80 is a non-trivial number of people.

Some kind of analytics has shifted fo sho, suddenly there's a lot of new people asking the most random things.
But to answer the question: Google Docs is a thing. Hate to be blunt, but that's all I've got atm.

You can get Dropbox service for companies, quite cheaply...

Another approach: set up a Squid server and refresh its caches every so often... you can use it with any kind of torrent-style file-sync software like Resilio. That way it'll work just like torrents, with no maintenance at all... (the Squid caching is optional, but it's very nice regardless.)

With Resilio-like software it doesn't matter what the connection is, in or out, as long as any host is up.

The problem is they're working externally. They translate books and have to hand the text in before a deadline. The bad part is they're not doing regular backups, so my boss wants to provide a backup solution for them, and he also wants to keep tabs on people falling behind "schedule". Though to be honest there is no schedule; people are asked to hand the whole book in once it's done. This would simplify matters for them and make these "I didn't back up for 4 months" problems less frequent. That's what I'm aiming for.

The Squid option looks really nice. Can it set a certain folder as an "always torrent to server" folder, so that whatever you put in it gets beamed over? I've created a couple of .torrent files in my younger days and it seems like a whole lotta work for a 50-year-old philosopher.

I'm not sure how the torrent solution would work, but I'm not really big on torrents so that's just a gap in my knowledge.

If getting the work synced to your server in a timely fashion is the solution you're looking for, rsync is also a thing. For Windows there's an application called SyncToy which, I believe, will let you schedule the syncing of one or more directories.

For extremely small amounts of data like in this case, this would be a pretty good solution. The big downside, which you'll have in Windows with almost any client backup software, is making sure that at some point in the day the files are closed so a good backup can be made. Otherwise you could point SyncToy at a work directory inside their Documents folder and have it sync to a directory on the server. Clean, simple, and it does a wonderful job of covering most of the work.

rsync and SyncToy might be great for this purpose, but I don't know about their cross-platform capabilities. I've contacted Resilio about their corporate plans (although they seem expensive as hell) and I'm also looking at rsync and SyncToy in combination. However, I'd still face certain problems: I want the users to have access to only a single folder on the filesystem, and for that Resilio seems like the best option. I set it up 5 minutes ago for my desktop and phone and it was a breeze; I can imagine even my mother accomplishing that. rsync and SyncToy look like more work up front, but the long-term maintenance might be easier. That's why I asked whether someone has done this already, so they could tell me what they tried and what works for them.

I'll play around with all of these and see what works. So far I'm really tempted to run Resilio free on each folder on a drive, with daily, weekly and monthly copies to a secondary hard drive and a flash drive, and give every employee in the company a key to their own folder.

The closed-files problem is a tricky one, although it might be solved if I set up a network drive for each person. That could be a little taxing on the bandwidth, though, since they wouldn't be editing files on their own computers, and I also wouldn't have the redundancy I need.

You could just get a Nextcloud setup running on a server you build for yourself, buy a domain name, and then have everyone sync to that.

Nextcloud seems fine, but does it have something that will back up the files from individual PCs automatically? Because that's the point I'm most worried about: the end user. They are (as I've said) too head-in-the-clouds to realize the importance of backing stuff up.

I'll take a look at it deeper, thanks :-)

They download the Nextcloud client for their computer, and then it syncs whenever they make changes. It's literally the same thing as Dropbox, except you host it yourself.

It backs up everything they put in their Nextcloud directory, so the only caveat I can think of is making sure they're working in the appropriate directory; the rest is automatic.
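There's also a headless command-line client, nextcloudcmd, if you ever need to script a sync instead of relying on the desktop app. A rough example (the user, password and URL are placeholders):

```
# one-shot sync of a local folder against a Nextcloud server
nextcloudcmd -u writer01 -p 'secret' /home/writer01/work https://cloud.example.com
```

You could drop something like that in a cron job for the rare machine where the tray client isn't an option.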


Sounds great, I'll have a look at it.

You can also use the client to sync specific folders on your devices; I have mine set up to sync backups of my /home directory and some other stuff.

You could use “syncthing” and a shared folder per employee.

For redundancy, you can have 2 servers (one at the office, another could be from lowendbox.com).

You would then do backups from the servers as normal.

You mentioned email, who hosts the mail servers for these people?


Ugh just noticed the necro bump, sorry.
OP, if you’re still reading, what did you come up with?

Hi, I’m still reading.

Syncthing might be nice, I’ll give it a look. My boss still hasn’t realized how much it would be worth.

He’s running two Synology servers, one at the office and one at work. Storage is not an issue.

Still looking into it, not willing to give up.