Deep Zoom Images are accessed over standard client-side HTTP requests against a file store holding the XML descriptor and image tile files. The generated image tile pyramids can produce a large number of files and corresponding folders: even though the pyramid adds only a small amount of extra disk space, a collection of 600 6-megapixel images will still generate over 140,000 files in more than 14,000 subdirectories. Moving such a large number of files can be challenging, but there are a number of techniques that can simplify deploying to a web server.
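To see where those file counts come from, here is a minimal sketch of the arithmetic, assuming a hypothetical 3000×2000 (6-megapixel) source, a 256-pixel tile size, and the usual ceil-halving pyramid; real tools differ in tile size, overlap, and default settings, so exact totals vary:

```python
import math

def dzi_stats(width, height, tile_size=256):
    """Rough count of the files and folders one Deep Zoom pyramid produces:
    one tile per (level, column, row), plus the XML descriptor, plus one
    subdirectory per level inside the <name>_files directory."""
    max_level = math.ceil(math.log2(max(width, height)))
    files, dirs = 1, 1            # the .dzi descriptor and the _files folder
    w, h = width, height
    for _level in range(max_level + 1):
        files += math.ceil(w / tile_size) * math.ceil(h / tile_size)
        dirs += 1                 # one subdirectory per pyramid level
        w, h = max(1, math.ceil(w / 2)), max(1, math.ceil(h / 2))
    return files, dirs

files, dirs = dzi_stats(3000, 2000)   # → (138, 14) for one image
print(600 * files, 600 * dirs)        # rough totals for a 600-image collection
```

Even under these conservative assumptions a 600-image collection runs to tens of thousands of files, and smaller tile sizes or different source dimensions push the totals higher still.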
Generate in Place
The easiest way to deploy a DZI is to generate the image tile pyramid directly on the production server. This saves the effort of transferring content to a hosted location, though it is best suited to well-established production environments.
A more typical implementation involves staging and deploying the source images, or a manifest linking to them, and then running a production server process to create the DZI in place. Another option is to run the content creation process on a local machine and save directly to the target deployment server.
This technique is not appropriate when it is not feasible to run the content creation process on the production server, when content must be pre-staged for testing or verification before deployment, or when significant replication beyond the origin server is required.
Various Deep Zoom content creation tools provide flexibility in setting the dimensions of image tiles. Using a larger tile size can reduce the number of files required, though depending on the content type, this may create an undesirable increase in tile load latency. For efficiently compressed images, larger tile sizes can significantly reduce the number of individual files and directories required.
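As a rough illustration of the trade-off (hypothetical 3000×2000 source; the tile sizes are chosen only for comparison), the tile count at the full-resolution pyramid level, which dominates the total, shrinks roughly with the square of the tile size:

```python
import math

def full_level_tiles(width, height, tile_size):
    # Tile count at the full-resolution pyramid level, which accounts
    # for most of the files in the whole pyramid.
    return math.ceil(width / tile_size) * math.ceil(height / tile_size)

for tile_size in (256, 512, 1024):
    print(tile_size, full_level_tiles(3000, 2000, tile_size))
    # 256 → 96 tiles, 512 → 24 tiles, 1024 → 6 tiles
```

The flip side is that each request transfers a larger image, which is where the added load latency comes from.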
Generate to ZIP
A custom workflow can generate DZI content directly to a ZIP archive. Even accounting for the extra step of unpacking the archive on the production server, this reduces the overhead of deploying a large number of individual image files.
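A minimal sketch of such a workflow using Python's zipfile module (`pack_dzi` and `unpack_dzi` are hypothetical names, not part of any Deep Zoom tool). The tiles are stored rather than deflated, since JPEG and PNG tiles are already compressed and rarely shrink further:

```python
import os
import zipfile

def pack_dzi(dzi_dir, zip_path):
    """Pack a generated DZI output tree (descriptor plus tile folders)
    into a single ZIP archive for transfer."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_STORED) as zf:
        for root, _dirs, files in os.walk(dzi_dir):
            for name in files:
                full = os.path.join(root, name)
                # Store paths relative to the pyramid root so the archive
                # unpacks into the same layout on the server.
                zf.write(full, os.path.relpath(full, dzi_dir))

def unpack_dzi(zip_path, dest_dir):
    """Server-side step: recreate the tile tree from the archive."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
```

Transferring one archive avoids the per-file overhead (directory creation, metadata, round trips) that dominates when copying many thousands of small tiles individually.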
When there is no choice but to copy the individual files to a deployment server over a network connection, there are a few techniques that streamline this process.
Windows includes ROBOCOPY, a command-line file copy utility that is the recommended tool for copying large numbers of files and directories, locally or across network connections. ROBOCOPY can take full advantage of system and network resources with multiple copy threads, and it offers options to mirror entire directory structures and to intelligently restart a previously interrupted copy without retransferring unchanged files.
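As a sketch, a mirrored copy of a generated pyramid to a server share might look like the following (the source and destination paths are placeholders; /MIR mirrors the directory tree and skips files already present at the destination, /MT enables multithreaded copying, and /R and /W limit retries and wait time on failed files):

```bat
robocopy C:\dzi\output \\server\share\dzi /MIR /MT:32 /R:2 /W:5
```

Note that /MIR deletes destination files that no longer exist at the source, so point it only at a directory dedicated to the mirrored content.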
If both the source and destination computers are running Windows Vista, Windows 7 (SP1 or later), Windows Server 2008, or Windows Server 2008 R2, full SMB2 support is available. In this environment the built-in file copy in Windows Explorer is much more efficient than was the case with previous versions of Windows or when copying in a heterogeneous operating system environment. While ROBOCOPY is still the preferred tool, Explorer can be handy for quickly copying smaller Deep Zoom images.
Performance may also improve significantly if anti-virus software is temporarily disabled, eliminating unnecessary scanning of each individual file being copied.