Hi,
I am interested in finding a solution to multicast a very large 21 GB image onto our new G5 machines. We have quite a few to do, so ASR via FireWire is no longer an acceptable solution. Has anyone done this before and been successful?
Background
----------
I spoke with an Apple System Engineer in Canada, and he informed me that Apple doesn't have a multicast solution in place yet. However, the developers at Apple HQ have heard our cry all the way from here and may be working on a solution in the near future.
Since I only found this software last night, I haven't had much time to play with it and work out all the compilation errors (missing and incomplete libraries for OS X). We have a Gb network for two brand new labs that need to be set up and running in a week, so I am really trying to find a more robust solution, as the image may be modified during a term for repairs or a complete overhaul. For now our plan B is to use Carbon Copy Cloner from http://www.bombich.com, which is working out to be the only stable solution. It's a painfully slow process: one machine at a time. Alternatively, we can NetBoot a few machines at a time and use ASR (Apple Software Restore), but the disk on our AFP share isn't fast enough to distribute data to more than 4 clients at a time. This process takes about 270-300 minutes to dump the 21 GB image onto these clients, since each connection is a unicast.
Any help and comments are appreciated.
I'd like to see a way that udpcast could help on the Mac. I don't know if it can be made to work outside of the x86 platform. Perhaps someone has insights into how to make that work, but given the timeline you have, it might be better to look at all options immediately.
If I were faced with one week in which to make this work, I'd consider a solution like g4u ("Ghost for Unix"), but running on Linux. I assume there are bootable CDROMs of Linux or even Mac OS available that will bring up a *nix-like environment on the G5. If you add the scripts from g4u (uploaddisk and slurpdisk) to such a bootable CDROM, it provides cloning similar to udpcast, except that it operates over standard FTP transfers.
This can probably handle a dozen or more simultaneous transfers. If you zero the unused portions of the master disk first, you can get the image down in size. In my case I have a 40 GB disk with a dual boot of Windows XP and Xandros Linux on the master, and it compresses down to a 3 GB image (same gzip in both g4u-style cloning and udpcast). When transferring something like that over FTP, the bottleneck is the client hard drive, so it takes quite a number of simultaneous FTP transfers before the network is saturated. If you have a gigabit network and Gb ethernet devices in those G5s, you might be able to clone a good number of machines simultaneously with the FTP-based method.
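To give a rough idea, the g4u-style transfer is essentially a dd/gzip pipeline pushed over FTP. A minimal sketch of the concept (the device name, server, and credentials are made up, and I'm using curl as a stand-in rather than the actual uploaddisk/slurpdisk scripts):

  # On the master: read the raw disk, compress, upload to an FTP server
  dd if=/dev/sda bs=1M | gzip -1 \
    | curl -T - ftp://user:pass@ftpserver/images/g5-master.img.gz

  # On each client: download, decompress, write to the raw disk
  curl ftp://user:pass@ftpserver/images/g5-master.img.gz \
    | gunzip | dd of=/dev/sda bs=1M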
I'm a big fan of udpcast, and would highly recommend it for x86 cloning, but based on the little I know about the options on the Mac, I'd seek out a bootable CDROM and try to work with the scripts, which only require dd, gzip, and ftp.
I'm not sure who makes a good live CD for the Mac - perhaps Gentoo? If you want to proceed this way, I have some scripts I've made work from Knoppix Linux for this purpose - I can email them to you.
--Donald Teed
Hi Donald,
I have certainly given up on trying to find a multicast solution for this week. I am preparing for the changes that will likely come right after the school term starts in about 10 days. I can manage most of it using ARD (Apple Remote Desktop), the equivalent of NetOp for PCs, but I still want to be able to push an image quickly if there is a need for it. For now I have a student working in the lab, switching FireWire cables every hour among 8 new machines, so that's moving along slowly but getting the job done.
I was thinking of doing exactly what you suggested with a boot disk, but going the long way around it. Yellow Dog Linux is one of the most popular distributions for PPC Macs. This is my plan:
1. Install a base Yellow Dog Linux system with compiler tools on a G5
2. Compile udpcast on the G5 and on an existing Red Hat Linux machine
3. Test-run udpcast between these two machines
4. Make a NetBoot image of the G5 machine with udpcast included (if possible) and utilize Apple's NetBoot server component
5. NetBoot a few machines and start the udpcast receiver on each, as sketched below
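For step 5, I imagine the sender/receiver pair would look roughly like this (the disk device and interface names are guesses on my part):

  # On the master G5 (sender): read the disk, compress, multicast to 8 clients
  dd if=/dev/sda bs=1M | gzip -1 | udp-sender --interface eth0 --min-receivers 8

  # On each NetBooted client (receiver): receive, decompress, write the disk
  udp-receiver --interface eth0 | gunzip | dd of=/dev/sda bs=1M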
This is my theory for putting together a working solution, but we'll see how it goes. One thing I am sure about is that developing a Linux-based solution is likely to be quicker than trying to port udpcast to Apple's Darwin/FreeBSD.
I am going to look at "g4u" as well; maybe that's an even faster, more efficient solution for my needs. My image is already compressed at 21 GB, which extracts to about 29 GB on disk. This image has about 15 pieces of software, including Adobe Premium CS and Final Cut Pro (I think), with all the media files for editing movies and graphics. Unfortunately, it really can't get any smaller than this.
I would certainly like to get hold of any scripts or tech notes that will guide me to accomplish this task.
Sincerely appreciate any help you can provide.
Cheers!
On Sat, 28 Aug 2004, Rishi R. Arora wrote:
I am going to look at "g4u" as well; maybe that's an even faster, more efficient solution for my needs. My image is already compressed at 21 GB, which extracts to about 29 GB on disk. This image has about 15 pieces of software, including Adobe Premium CS and Final Cut Pro (I think), with all the media files for editing movies and graphics. Unfortunately, it really can't get any smaller than this.
Just to be sure we have not misunderstood and are not missing details...
The disk on the master will not compress very well unless the unused parts of the drive have been overwritten with zeros. You can do this from OS X as the root user. For each partition containing your OS file system (not for swap), you should fill it up with a bogus file containing zeros. Zeroed raw data on the drive compresses very well:
dd if=/dev/zero of=/zero.bits
You don't need a bs or count value. Just let it run until it hits the wall and fills the disk partition. Repeat for any additional partitions, then rm the /zero.bits file.
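For example, on a master with a second data volume, the whole sequence might look like this (the /Volumes/Media path is just illustrative):

  # Fill each volume with zeros (dd exits when the volume is full),
  # then remove the filler file so the space is free again
  sudo dd if=/dev/zero of=/zero.bits
  sudo rm /zero.bits
  sudo dd if=/dev/zero of=/Volumes/Media/zero.bits
  sudo rm /Volumes/Media/zero.bits
  sync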
You might already be doing this, but I thought it was worth repeating, because many people reporting large image sizes have not done this step. But in your case perhaps you really do have 20+ GB of actual data on the drive, due to the media editing suites and related sample files. Sorry if I'm repeating known info, but I didn't want you to miss this.
--Donald Teed
Rishi R. Arora wrote:
I am interested in finding a solution to multicast a very large 21 GB image onto our new G5 machines. We have quite a few to do, so ASR via FireWire is no longer an acceptable solution. Has anyone done this before and been successful?
Well, no. However, I suspect you can get a Linux machine running on your network with the image you want to udpcast. Gentoo seems to have live CDs available here:
http://gentoo.osuosl.org/experimental/ppc/livecd/g5/
You may have to compile the udpcast client for PPC and put it in the image (or copy it onto each machine after booting -- yeech). Then it should be a straightforward 'dd' to /dev/sda (or whatever).
I don't think Linux can mkfs an HFS+ volume, so you're probably stuck with dd (+gzip). Otherwise, you could create the filesystem, install the bootloader, and then udpcast and unpack a tar file.
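For what it's worth, the tar route would look roughly like this on a platform where you can mkfs the target filesystem -- which, again, rules out HFS+ here (the device and mount point names are placeholders):

  # Sender: stream a tar of the master filesystem into udpcast
  tar cf - -C /mnt/master . | udp-sender

  # Receiver: make the filesystem first, then unpack the multicast stream
  mkfs.ext3 /dev/sda3
  mount /dev/sda3 /mnt/target
  udp-receiver | tar xf - -C /mnt/target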
rgds, Chris
Hi Christopher,
Thanks for the tip on the Live CD for the G5. It will surely save me some time preparing the custom boot disk I was going to make for the G5 machines. Since I can get a head start with this CD, I can work on compiling udpcast on a G5 machine.
Cheers!