The action clean-working-directory runs out of heap space while cleaning up huge working directories (30 GB / 500,000 files & directories in our case):
2009-03-08 06:00:14,582 [pool-1-thread-1] INFO buildController - Initializing build
2009-03-08 06:00:15,286 [pool-1-thread-1] INFO buildController - Starting build of ivu_plan_nightly_build
2009-03-08 06:00:15,301 [pool-1-thread-1] INFO buildController - Purging exiting working copy
2009-03-08 06:00:15,301 [pool-1-thread-1] INFO buildController - Performing action clean-working-directory
2009-03-08 06:31:39,209 [Thread-3] ERROR taskQueueExecutor#build-project - Error executing task
edu.emory.mathcs.backport.java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
Caused by: java.lang.OutOfMemoryError: Java heap space
The issue seems to be caused by the implementation in CleanWorkingDirectoryAction: the FileSetManager used to delete the working directory first scans all files and directories to build an object representation of the whole tree, and only then deletes the tree by iterating over those objects. Since no filter condition is required here, using org.codehaus.plexus.util.FileUtils to delete the tree directly would avoid this kind of problem.
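To illustrate the idea (this is a hypothetical sketch, not the attached patch): a streaming delete visits each entry once and removes it immediately, so memory use stays roughly constant no matter how many files the tree holds. FileUtils.deleteDirectory behaves along these lines; the sketch below uses only the JDK's java.nio.file API for the same effect.

```java
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

// Hypothetical illustration of a constant-memory tree delete: no per-file
// object model is built up front; entries are removed as they are visited.
public class StreamingDelete {

    public static void deleteTree(Path root) throws IOException {
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs)
                    throws IOException {
                Files.delete(file); // delete each file as soon as it is seen
                return FileVisitResult.CONTINUE;
            }

            @Override
            public FileVisitResult postVisitDirectory(Path dir, IOException exc)
                    throws IOException {
                if (exc != null) {
                    throw exc; // surface any error from traversing this directory
                }
                Files.delete(dir); // delete each directory once it is empty
                return FileVisitResult.CONTINUE;
            }
        });
    }
}
```

In contrast, an approach that first builds a list of all 500,000 paths (as FileSetManager does) holds the entire tree in memory before the first delete happens, which is what exhausts the heap here.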
I'll attach a patch based on 1.2.3.