Need advice on setting up a data backup server for imaging center


New Member
Jun 15, 2015
I work in an imaging facility with confocal, super-resolution, and electron microscopes. We generate lots of data every day.
The problem we face is this:
Users want us to hold on to and store data, in some cases for many years.
We currently have no real data policy, but we are working on one (that's one reason why I'm here).
We already have years of data backed up, but we're running out of room fast (oldest data is from 2007).
Our goal has three parts: One: provide a reliable way for users to transfer data (avoiding the use of USB drives) so they can retrieve it at a later date.
Two: back up this same data to a server for longer-term storage (we have not worked out a time limit).
Three: back up important data that we (the people who run the facility) are directly involved in for publication; we want to keep this data for many years (10+ or forever).
We use the Netdrive software (WebDAV protocol) to transfer files for users, with a 30-day time limit and auto-delete to keep that disk clean; total size is 6 TB. There are 200+ users transferring data back and forth at any given moment.
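For reference, the 30-day auto-delete amounts to something like the sketch below (not our actual job; `purge_old_files` is a made-up name, and checking only file modification time is an assumption):

```python
import os
import time

def purge_old_files(root, max_age_days=30):
    """Delete files under `root` whose modification time is older than
    `max_age_days` days; return the paths that were removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                os.remove(path)
                removed.append(path)
    return removed
```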
We have a backup server (21 TB) set up for long-term storage for the users. This server is somewhat disorganized (I might need a little advice on how to keep it organized). It is also used for our important (published) data that we need to keep forever.
The problem is that we are running out of room, and out of ideas on how to manage the data.
The Data:
Data collected on a confocal microscope can range from 1 MB to 20 GB per file. Some users collect 10 GB over three hours in a range of file sizes, and a given folder may hold 50 or more images; on the other hand, a user can collect a single 15 GB image in about 2 minutes. Some folders are big because of one large file, while others are many small files adding up to a big folder.
The super-resolution microscope has an average file size of about 9 GB, but these images can range up to 40 GB.
The electron microscopes tend to produce smaller files, ranging from 1 MB up to 1 GB depending on resolution and image size.
What we run into is that all the microscope software is Windows-based, and the largest file we can transfer over WebDAV in Windows is 4 GB. Somehow Netdrive gets us past this limitation. We want to move away from Netdrive, but we need to know how to get past the 4 GB limit.
Currently I have to manually transfer data from each computer to our backup server in order to clean up the built-in data disks. The data disk on each computer is a 2 TB RAID.
Is there anyone out there who can give me advice on how to transfer data more efficiently? Also, if we build a server out of a few old computers, can we use it for our important backup data?



Staff member
Oct 31, 2009
WebDAV is not the best way. It is an HTTP-based protocol and is not designed for files larger than 4 GB. In fact the protocol has a limit of 2 GB, but in actual use 4 GB is the largest you'll manage.
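On the Windows client side, if I remember the setting right, the built-in WebClient (WebDAV mini-redirector) also enforces its own cap via the FileSizeLimitInBytes registry value; since it is a 32-bit DWORD, even maxed out it tops out around 4 GB, which matches what you're seeing. Raising it to the maximum looks roughly like this (run as administrator; a sketch, not a reason to stay on WebDAV):

```shell
:: Raise the WebClient file-size cap to its maximum (~4 GB).
:: FileSizeLimitInBytes is a DWORD, so 0xffffffff is as high as it goes.
reg add "HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" ^
    /v FileSizeLimitInBytes /t REG_DWORD /d 0xffffffff /f

:: Restart the service so the new value takes effect.
net stop WebClient && net start WebClient
```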

My suggestion would be secure FTP in combination with PowerShell scripts to automate the process for your users. You could set them up as scheduled tasks or have the users start the script when they need to transfer files.
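The same automation can be sketched in Python instead of PowerShell (the idea is identical either way). Everything here is illustrative: the host, credentials, and the `remote_path` folder layout are assumptions, and "secure FTP" is read as FTP over explicit TLS (FTPS) via the stdlib `ftplib`; use SFTP instead if your server speaks SSH.

```python
import os
from datetime import date
from ftplib import FTP_TLS

def remote_path(instrument, user, filename, when=None):
    """Build a tidy server-side path like 'confocal/jsmith/2015/06/scan.tif'
    so uploads stay organized by instrument, user, and month."""
    when = when or date.today()
    return f"{instrument}/{user}/{when:%Y/%m}/{filename}"

def upload(host, user, password, local_file, dest):
    """Push one file over FTPS (explicit TLS on the control channel)."""
    ftps = FTP_TLS(host)
    ftps.login(user, password)
    ftps.prot_p()  # encrypt the data channel too, not just the login
    with open(local_file, "rb") as fh:
        ftps.storbinary(f"STOR {dest}", fh)
    ftps.quit()

# Example (hypothetical server and account):
# upload("backup.example.edu", "jsmith", "secret",
#        "D:/scan.tif", remote_path("confocal", "jsmith", "scan.tif"))
```

A per-instrument path convention like this also helps with the disorganized backup server mentioned above, since every upload lands in a predictable folder.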

I transfer 100 GB+ files all the time, and FTP is the best way I've found.