Hi,
There are a few big EIS FITS files that are difficult to process with "eis_prep", for example:
20/eis_l0_20070820_170353.fits.gz 122M
21/eis_l0_20070821_000022.fits.gz 118M
21/eis_l0_20070821_133226.fits.gz 117M
These FITS files are mostly generated by studies such as "HPW001_FULLCCD_v2", which perform a full-CCD scan (often with over 80 raster positions).
The typical error message is "unable to allocate memory to make array", even though the machine running "eis_prep" has a fairly large amount of memory (4 GB). One way to let "eis_prep" keep running is probably to do only the dark-current (DC) removal and absolute calibration, and skip the cosmic-ray (CR) and hot-pixel (HP) removal, :-(
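For reference, a minimal sketch of such a reduced run. The /nocr and /nohp switches below are hypothetical names I am using for "skip CR removal" and "skip HP removal"; check the eis_prep header for the actual keyword names.

    ; Hypothetical reduced calibration: dark-current removal and absolute
    ; calibration only. /nocr and /nohp are assumed keyword names for
    ; skipping cosmic-ray and hot-pixel removal -- verify against the
    ; eis_prep documentation before use.
    file = 'eis_l0_20070820_170353.fits.gz'
    eis_prep, file, /default, /save, /nocr, /nohp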
Regards,
JianSun 04:55:08 26-Nov-2024 GMT
Hi, Jian
This is starting to become a big problem, and I think you're asking the right questions.
However, I don't think we should stop taking large data just because the current software doesn't allow us to analyse it. It just means we need to think about how to handle these larger files. We want to take the best data we can and worry about analysing it later; you never know when an instrument/satellite can fail.
This sounds like a major issue for the team meetings in October.
In the meantime, if anyone has comments, please start making them! This is something that really needs to be sorted out.
--David R Williams, 19-Sep-2007
Hi David,
I agree with you and Harry that we should get the science data first and then work out a solution if there is a problem analysing it. So we probably need some modifications to "eis_prep" to improve its memory performance when processing these big FITS files.
JianSun 04:55:08 26-Nov-2024 GMT
The Hierarchical Data Format (HDF) was made for large files. There are a lot of good tools in IDL for working with HDF files.
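One reason HDF helps here is that you can read just a slab of a large dataset rather than the whole array. A rough sketch with IDL's built-in HDF5 routines (the file name 'eis_scan.h5' and dataset name 'intensity' are made up for illustration):

    ; Read only a sub-region (hyperslab) of a large HDF5 dataset, so the
    ; full array never has to fit in memory at once.
    fid = h5f_open('eis_scan.h5')
    did = h5d_open(fid, 'intensity')
    fs  = h5d_get_space(did)            ; dataspace of the on-disk dataset
    start = [0, 0, 40]                  ; offset of the slab
    count = [1024, 512, 1]              ; size of the slab
    h5s_select_hyperslab, fs, start, count, /reset
    ms  = h5s_create_simple(count)      ; matching in-memory dataspace
    slab = h5d_read(did, file_space=fs, memory_space=ms)
    h5s_close, ms
    h5s_close, fs
    h5d_close, did
    h5f_close, fid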
--KenDere, 28-Sep-2007
Hi, Jian,
This problem arises from the limited amount of memory that IDL can allocate. IDL running on a 32-bit computer can in principle allocate about 2.9 GB of memory. However, in my experience it can only allocate a little more than 1 GB, which is not enough to process such a large EIS file with EIS_PREP.
Nevertheless, this problem can easily be solved on a 64-bit computer running the corresponding 64-bit version of IDL. Of course, the computer should have enough physical memory (maybe 4 GB or more). In this case IDL can allocate enough memory for EIS_PREP to process large EIS files. I have tested this using the file eis_l0_20070824_001531.fits.gz and processed it with EIS_PREP successfully.
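A quick way to check which case you are in: !VERSION.MEMORY_BITS reports whether the IDL session itself is 32- or 64-bit, and MEMORY(/CURRENT) reports how much dynamic memory is in use. A small allocation test (the ~3 GB array size is arbitrary, chosen only to exceed the 32-bit limit):

    ; Test whether this IDL session can allocate a >2 GB array.
    print, 'IDL address width: ', !version.memory_bits, ' bits'
    catch, err
    if err ne 0 then begin
      catch, /cancel
      print, 'Allocation failed: ', !error_state.msg
    endif else begin
      big = fltarr(1024, 1024, 768)     ; ~3 GB of 4-byte floats
      print, 'In use: ', memory(/current), ' bytes'
      big = 0                           ; release the storage again
    endelse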
Another way to solve this problem, I guess, is to optimize the IDL code of EIS_PREP and related routines in order to reduce the memory requirement. This might be done by freeing allocated memory immediately after it is used and by reducing temporary variables, if any. I am thinking about this because I met the same problem when I processed several smaller EIS files one after another. I therefore suspect that the memory allocated during a call to the code is not freed after the call returns.
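Core IDL already has a mechanism for part of this: TEMPORARY() hands a variable's storage to an expression instead of copying it, and overwriting a large array with a scalar releases its memory immediately. A toy sketch of the pattern (array size and gain value are arbitrary):

    ; Avoid keeping two copies of a big array alive at once.
    raw = fltarr(2048, 1024, 100)       ; large input array (~800 MB)
    gain = 1.2
    ; cal = raw * gain would leave both raw and cal in memory;
    ; temporary() lets IDL reuse raw's storage for the result instead.
    cal = temporary(raw) * gain         ; raw is now undefined
    ; Once an intermediate array is no longer needed, overwrite it with
    ; a scalar so its storage is freed at once rather than at exit.
    cal = 0

Applied systematically inside EIS_PREP, this would lower the peak memory footprint without changing the results.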
--HuiLi 02-Oct-2007