Absolutely!!
I guess I could measure each cell after the pack has been charged to see if there are any showing slight voltage differences? That would be a fair indication of problem cells?
It is better to test the cells when the pack has been drained, as the differences between the cells are far easier to detect. A cell's ability to deliver energy drops off very rapidly below 1.0v. If you discharge your pack to 33 volts (1.1 volts per cell on a 30-cell pack), you will most likely find that you have a bunch of cells at 1.18v and a bunch at 0.95v or less. Pull out the weaklings, test them individually to determine their capacity, and replace them if necessary. If all your cells are amazingly at 1.1v, then you already have a perfectly matched pack.
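If you want to automate the bookkeeping, here's a minimal sketch in Python, assuming you've already measured and noted down the per-cell voltages. The 1.05v cut-off and the example readings are just illustrative assumptions, not a standard:

# Flag the weaklings from a list of measured cell voltages.
# The 1.05v cut-off and the example readings are illustrative only.

def find_weak_cells(cell_voltages, cutoff=1.05):
    """Return (cell number, voltage) pairs for cells below the cutoff."""
    return [(i + 1, v) for i, v in enumerate(cell_voltages) if v < cutoff]

# Example: a 30-cell pack discharged to roughly 33v.
measured = [1.18, 1.17, 0.95, 1.16, 0.92] + [1.09] * 25
for cell, voltage in find_weak_cells(measured):
    print(f"Cell {cell}: {voltage:.2f}v - pull out and capacity-test")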
To condition the cells for use in the pack, ideally you should drain them all to 0.9v and then give them a 'formation' charge of C/10 for 14 hours to ensure they're all in sync with each other and charged to the same level.
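For what it's worth, the arithmetic behind the 14 hours: C/10 for 14 hours puts roughly 140% of the rated capacity back in, which covers the charge inefficiency of NiMH at low rates. A quick check, assuming a hypothetical 3000mAh cell:

capacity_mah = 3000                     # rated capacity (assumed example cell)
charge_rate_ma = capacity_mah / 10      # C/10 -> 300 mA
charge_put_in = charge_rate_ma * 14     # 14 hours -> 4200 mAh
print(f"{charge_put_in:.0f} mAh delivered, "
      f"{100 * charge_put_in / capacity_mah:.0f}% of rated capacity")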
It is probably worth mentioning that although NiMH cells suffer no damage from being discharged to 0v, they do get damaged if a negative voltage is applied. For cells inside a pack, this happens when one cell runs out of charge prematurely: the other cells continue to push current through it, creating a reverse potential across the empty cell. For this reason, I would recommend that once you have discharged the pack to below 36v, you switch to a low discharge rate (C/10) to finish off the weeding-out exercise.
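If you're running the discharge from a controllable load, the switch-over logic is simple enough. Here's a rough sketch, assuming a 30-cell pack and a hypothetical read_pack_voltage() / set_discharge_rate() interface to your own hardware (those are placeholders, not a real library):

import time

SWITCH_VOLTAGE = 36.0   # ~1.2v per cell on a 30-cell pack
STOP_VOLTAGE = 27.0     # 0.9v per cell: end of the weeding-out discharge

def discharge_pack(read_pack_voltage, set_discharge_rate, capacity_ah):
    """Discharge hard at first, then drop to C/10 below SWITCH_VOLTAGE."""
    set_discharge_rate(capacity_ah)                # 1C bulk discharge (assumed rate)
    while (voltage := read_pack_voltage()) > STOP_VOLTAGE:
        if voltage <= SWITCH_VOLTAGE:
            set_discharge_rate(capacity_ah / 10)   # C/10 so a flat cell isn't driven into reversal
        time.sleep(1)                              # poll once per second
    set_discharge_rate(0)                          # disconnect the load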
I have read elsewhere that large NiMH packs have a tendency to fail because the characteristics of the individual cells within them change as the packs age, so even though you might start with perfectly matched cells, after 6 months you'll have a weakling in the mix. Because of this, I have been wondering why nobody ever seems to use a Li-Ion style charge-balancing/cell-protection system with NiMH packs. There must be a good reason for this, but I haven't found it yet.