I've written two scripts for batch data generation. Both begin with the word "generate" and are located in /cxfs/DASG/hleee/vdc.
Each script generates a similarly named .out file (e.g., generate_rando.m's output is captured in generate_rando.out) and a
_stats.mat MATLAB data file (e.g., generate.m saves its statistics variables to generate_stats.mat).
Note that all wavelet filters are stored in filters.mat and are loaded by the scripts.
The scripts do the following:
generate.m computes the wavelet transforms of the TG data using the Haar, coif6, and coif12 wavelets (the number corresponds to the number of taps in the filter). It then denoises the resulting coefficients using the universal threshold and a hybrid SUREshrink threshold (cf. Ogden, T.: Essential Wavelets for Statistical Applications and Data Analysis, p. 147), reconstructs the thresholded data, and stores it in a .raw file whose name begins with coh_. It then subtracts the coh_ file from the original and stores the result in an incoh_ file. To use it, you must specify the variable to decompose (on line 8). For a variable wx, the resulting data are stored in generate_stats_wx.mat and generate_wx.out. This is the main script I have used to generate the vdf data. A sketch of the denoise-and-split step follows.
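For reference, the universal threshold for N noisy samples with noise level sigma is sigma*sqrt(2*log(N)) (Donoho-Johnstone), while the hybrid SUREshrink rule chooses between the SURE-optimal and universal thresholds per level. Here is a minimal sketch of the idea, assuming the MATLAB Wavelet Toolbox; the decomposition depth, the 'coif1' wavelet name (MATLAB's 6-tap coiflet), and the output file names are illustrative assumptions, not copied from generate.m:

    % minimal sketch of the denoise-and-split step; assumes the Wavelet
    % Toolbox is on the path and x already holds the variable selected
    % on line 8. depth (5) and file names below are illustrative.
    coh = wden(x, 'heursure', 's', 'mln', 5, 'coif1');    % hybrid SUREshrink
    % coh = wden(x, 'sqtwolog', 's', 'mln', 5, 'coif1');  % universal threshold
    incoh = x - coh;                     % incoherent part = original - coherent

    % write the results following the coh_/incoh_ raw-file convention
    fid = fopen('coh_x.raw', 'w');   fwrite(fid, coh,   'float32'); fclose(fid);
    fid = fopen('incoh_x.raw', 'w'); fwrite(fid, incoh, 'float32'); fclose(fid);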
generate_hier.m implements a level-dependent hard threshold on user-specified data corrupted by hierarchical Gaussian noise. This technique is fairly rudimentary, and there's probably room for improvement; a sketch of the approach is below.
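The sketch below shows one way a level-dependent hard threshold could look, again assuming the Wavelet Toolbox; the Haar wavelet, the depth, and the per-level MAD noise estimate are my assumptions, since generate_hier.m's actual choices aren't given here. Estimating sigma separately at each level is what makes the threshold suit noise whose variance differs across scales:

    % assumed sketch of a level-dependent hard threshold (not the actual
    % generate_hier.m): estimate the noise scale at each level, then hard
    % threshold that level's detail coefficients.
    n = 5;                                     % decomposition depth (illustrative)
    [c, l] = wavedec(x, n, 'haar');
    for k = 1:n
        d = detcoef(c, l, k);                  % detail coefficients at level k
        sigma = median(abs(d)) / 0.6745;       % MAD noise estimate for this level
        thr = sigma * sqrt(2 * log(numel(d))); % universal threshold at level k
        d = wthresh(d, 'h', thr);              % hard threshold ('h')
        first = sum(l(1:end-k-1)) + 1;         % start of level-k details in c
        c(first:first+l(end-k)-1) = d;         % write thresholded details back
    end
    xh = waverec(c, l, 'haar');                % reconstruct the denoised signal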