AIX jumpstart: day 3
Oct. 22nd, 2008 09:48 pm

Finished off the last few bits and pieces of the LVM topic this morning and did a series of labs that involved setting up new volume groups and logical volumes, fiddling with existing volumes and setting up mirroring on the root VG. All this was scheduled to take around an hour but, needless to say, it took me about 10 minutes to complete, whereupon I spent the remaining time reading the paper, doing the crossword and generally feeling extremely conceited.
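For anyone curious, the lab boils down to a handful of LVM commands. A rough sketch of the sequence from memory — the disk names (hdisk1, hdisk2) and the VG/LV names and sizes are my own placeholders, not the course's:

```shell
# Create a new volume group on a spare disk, then carve out a logical volume.
mkvg -y datavg hdisk1             # new VG named datavg on hdisk1
mklv -y datalv -t jfs2 datavg 10  # LV of 10 logical partitions, jfs2 type
lsvg -l datavg                    # confirm the LV landed where expected

# Mirror the root volume group onto a second disk and make it bootable.
extendvg rootvg hdisk2            # add the spare disk to rootvg
mirrorvg rootvg hdisk2            # mirror every LV in rootvg onto it
bosboot -ad /dev/hdisk2           # write a boot image to the new disk
bootlist -m normal hdisk0 hdisk2  # let firmware boot from either copy
```

Worth noting that mirrorvg kicks off a resync behind the scenes, so on a real system the mirroring takes a good deal longer than the ten minutes it takes to type the commands.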
The afternoon was taken up with a whole load of stuff on JFS and JFS2, including a series of experiments showing the effects of the various tools on file system creation and destruction. After going through the first set of slides on paging and virtual memory, the day rounded off with a few interesting tips on AIX performance monitoring tools, including a mention of the rather useful-sounding nmon.
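The creation and destruction experiments amount to surprisingly little typing. A hedged sketch of the cycle — the VG name, mount point and sizes below are placeholders of mine:

```shell
# Create a JFS2 file system on a new LV in datavg and mount it.
crfs -v jfs2 -g datavg -m /data -a size=512M -A yes
mount /data

# JFS2 file systems can be grown online (and shrunk on newer AIX levels).
chfs -a size=+256M /data

# Tear it down again: unmount, then remove the FS and its LV together.
umount /data
rmfs -r /data
```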
The day's highlight was a slightly off-topic discussion of fragmentation of Unix/Linux file systems. The instructor asked whether, when using the standard defrag method of
dump and restore, it was necessary to delete the contents of the file system prior to recreating the data. I suggested that the answer might well depend on the nature of the fragmentation involved. If the data was scattered across data blocks throughout the system, i.e. external fragmentation, then it might be necessary to re-make the file system in order to ensure that the restored data was closely packed across the disc. But if the fragmentation involved wasted space within disc blocks on a system that supports partial block allocations, i.e. internal fragmentation, it may simply be enough to restore the files and let the file system deal with the packing itself. The more I think about my answer, though, the less sure I am about it.

In other slightly off-topic news, I noticed on the way home that I seem to have picked up a course-related injury: having spent the week writing more in longhand than I've written since I was a student, I've noticed that I've developed something that looks suspiciously like a blister on my middle finger. I wonder if Health and Safety covers injuries due to excessive writing...
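For reference, the defrag-by-copy cycle we were discussing looks roughly like this on AIX, using the inode-based backup and restore commands. The file system and VG names here are examples of mine, and since this destroys and recreates the file system it is very much a sketch, not a recipe:

```shell
# 1. Take a full (level 0) inode-based backup of the file system.
backup -0 -u -f /tmp/home.backup /home

# 2. Unmount and recreate the file system, wiping the old block layout.
umount /home
rmfs /home
crfs -v jfs2 -g rootvg -m /home -a size=512M
mount /home

# 3. Restore the files; they are written out afresh as they land.
cd /home && restore -r -f /tmp/home.backup
```

Whether step 2 is strictly necessary was exactly the instructor's question: skipping the rmfs/crfs and simply emptying the file system before the restore would be the test of my internal-fragmentation theory.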