Archiving advice
We are looking to tidy up and archive old files off a Windows 2003 file server file share. There is over 100GB of data. It's in quite a well-structured folder hierarchy, so users know where to look for data, but some of it won't have been accessed in years. I need to archive the old material while maintaining the folder structure in the archive, and I would also welcome advice on where to archive to (what type of media) so it can be accessed quickly if ever needed. Obviously taking a copy and putting it on another server defeats the object of tidying up old files, but deleting the information would cause problems, as the old files will need to be accessed from time to time. If you have done a similar exercise, any input is welcome.
April 24th, 2012 9:14am
Hello,
I think it will be better to ask them here: http://social.technet.microsoft.com/Forums/en-US/winserverfiles/threads
April 24th, 2012 9:21am
It really depends on what level of solution you currently have and what you're aiming for.
One top-level approach is to store the files on a SAN. A SAN is usually divided into fast and slow storage, with the fast storage being a lot more expensive (or it could be SSD versus SATA).
Files that have not been accessed for a period of time are then moved from fast storage to slow storage, which brings the cost per GB stored down. After a further period on slow storage, the files can be compressed in place.
The cost stays the same, but the amount of data you are able to store goes up. Eventually you can decide whether to delete the files or just keep them compressed.
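The move-by-age policy described above can be sketched in a few lines. This is a minimal illustration, not a product recommendation; the `fast_root`/`slow_root` paths and the `tier_down` helper name are my own inventions. It relocates files whose last-access time is older than a cutoff, mirroring the folder structure under the slow-storage root:

```python
import os
import shutil
import time

def tier_down(fast_root, slow_root, max_idle_days):
    """Move files not accessed within the cutoff from fast storage
    to slow storage, keeping the same relative folder layout."""
    cutoff = time.time() - max_idle_days * 86400
    for dirpath, _dirnames, filenames in os.walk(fast_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if os.path.getatime(src) < cutoff:
                rel = os.path.relpath(src, fast_root)
                dst = os.path.join(slow_root, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)  # slow_root now mirrors fast_root
```

Note that last-access timestamps are only reliable if the volume actually updates them, so check that before trusting the cutoff.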
The implementation I was part of used the CA File System Manager solution (the former Arkivio product), which crawled all content, indexed it, and moved it according to a set of policies, basically leaving behind a shortcut to each file.
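The move-and-leave-a-shortcut pattern can be sketched as well. Real HSM products replace the file with a reparse point or stub the filesystem understands; this simplified sketch (the `stub_and_move` name is mine) just leaves a small text file recording where the original went:

```python
import os
import shutil

def stub_and_move(src, archive_root, live_root):
    """Move a file to the archive tree (preserving its relative path)
    and leave a plain-text stub behind pointing at the new location."""
    rel = os.path.relpath(src, live_root)
    dst = os.path.join(archive_root, rel)
    os.makedirs(os.path.dirname(dst), exist_ok=True)
    shutil.move(src, dst)
    with open(src + ".archived.txt", "w") as stub:
        stub.write("Moved to: " + dst + "\n")
    return dst
```

Users browsing the share still see where each archived file lives, without the archive occupying fast storage.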
For 100GB of storage, there is very little need for an automated archiving solution. A standard compression tool would recover some of the space, at the cost of increased access time, and it is obvious at a glance whether a file is zipped or not, so I would simply do this manually. I am not aware of any cost-effective tools that would give you any benefit at that amount of storage.
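Even the manual zip-the-old-folders approach can be scripted once you know which folders qualify. A minimal sketch (the `archive_folder` helper name is my own) that compresses a directory into a .zip alongside it, storing paths relative to the folder so the internal structure is preserved:

```python
import os
import zipfile

def archive_folder(folder):
    """Compress a folder into <folder>.zip next to it, keeping the
    internal directory structure via relative archive names."""
    zip_path = folder.rstrip(os.sep) + ".zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for dirpath, _dirs, files in os.walk(folder):
            for name in files:
                full = os.path.join(dirpath, name)
                zf.write(full, os.path.relpath(full, folder))
    return zip_path
```

After verifying the archive opens and contains everything expected, the original folder can be deleted to reclaim the space.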
April 24th, 2012 9:34am