Hi,
There are a few large EIS FITS files which are not easy to process with "eis_prep", for example:
20/eis_l0_20070820_170353.fits.gz 122M
21/eis_l0_20070821_000022.fits.gz 118M
21/eis_l0_20070821_133226.fits.gz 117M
These FITS files are mostly generated by studies such as "HPW001_FULLCCD_v2", which run a full-CCD scan (often with over 80 raster positions).
The typical error message is "unable to allocate memory to make array", even though the machine running "eis_prep" has a reasonably large amount of memory (4 GB). One way to let "eis_prep" keep running is probably to do only the dark-current removal and absolute calibration, and skip the cosmic-ray and hot-pixel removal. :-(
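Just to illustrate the scale of the problem, here is a rough sketch (in Python with astropy, purely for illustration; eis_prep itself is of course IDL/SolarSoft) that uses only the FITS headers to estimate how much memory the raw data alone would need. With several floating-point working copies made during cosmic-ray and hot-pixel cleaning, 4 GB disappears quickly. The file name is taken from the list above and is assumed to be already gunzipped.

```python
from astropy.io import fits

# Purely illustrative (eis_prep itself is IDL/SolarSoft): estimate from the
# FITS headers alone how much memory the raw data in one of the big files
# occupies. File name from the list above, assumed already gunzipped.
filename = "eis_l0_20070820_170353.fits"

total_bytes = 0
with fits.open(filename, memmap=True) as hdul:
    for hdu in hdul:
        hdr = hdu.header
        n_axes = hdr.get("NAXIS", 0)
        if n_axes == 0:
            continue
        n_elem = 1
        for i in range(1, n_axes + 1):
            n_elem *= hdr[f"NAXIS{i}"]
        total_bytes += n_elem * abs(hdr["BITPIX"]) // 8

print(f"Raw data: {total_bytes / 2**20:.0f} MiB")
print("eis_prep holds several floating-point working copies of this,")
print("so a full-CCD raster can exhaust 4 GB during CR/HP removal.")
```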
Regards,
JianSun 18:53:45 22-Nov-2024 GMT
Hi Jian,
This is starting to become a big problem, and I think you're asking the right questions.
However, I don't think we should stop taking large data because the current software doesn't allow us to analyse it. This just means we need to think about how to handle such large files on the software side. We want to take the best data we can and worry about analysing it later. You never know when an instrument or satellite might fail.
This sounds like a major issue for the team meetings in October.
In the meantime, if anyone has comments, please start making them! This is something that really needs to be sorted out.
--David R Williams, 19-Sep-2007
Hi David,
I agree with you and Harry that we should get the science data first and work out a solution later if there is a problem analysing it. So we probably need some modifications to "eis_prep" to improve its memory performance when processing these big FITS files.
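To show the kind of modification I have in mind, here is a very rough sketch in Python (not IDL, and not the real eis_prep; clean_window() is a made-up stand-in for the dark-current/CR/HP/calibration steps, and the real EIS level-0 files store their windows differently): each window is cleaned and written out one at a time, so memory use is bounded by a single window rather than the whole full-CCD raster.

```python
from astropy.io import fits
import numpy as np

def clean_window(window):
    """Hypothetical stand-in for the per-window corrections
    (dark current, cosmic rays, hot pixels, absolute calibration)."""
    return np.asarray(window, dtype=np.float32)

# Illustrative only: stream through the file one extension at a time and write
# each cleaned array straight to an output file, so only one window is held in
# memory at any moment. File names are for illustration.
infile = "eis_l0_20070820_170353.fits"   # decompressed name from the list above
outfile = "eis_l1_20070820_170353.fits"  # hypothetical output name

with fits.open(infile, memmap=True) as hdul:
    # Copy the primary header unchanged, then process extensions one by one.
    fits.PrimaryHDU(header=hdul[0].header).writeto(outfile, overwrite=True)
    for hdu in hdul[1:]:
        if not isinstance(hdu, fits.ImageHDU):
            continue  # simplified: only image extensions handled in this sketch
        fits.append(outfile, clean_window(hdu.data), hdu.header)
```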
JianSun 18:53:45 22-Nov-2024 GMT
The Hierarchical Data Format (HDF) was made for large files. There are a lot of good tools in IDL for working with HDF files.
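For what it's worth, here is a minimal sketch of the idea, in Python with h5py rather than the IDL HDF routines (the dataset and file names are made up for illustration): the image extensions of a big FITS file are copied into a chunked, compressed HDF5 file, which can then be read back one slice at a time instead of loading the whole array.

```python
import h5py
import numpy as np
from astropy.io import fits

# Illustrative only: copy the image extensions of a large FITS file into a
# chunked, compressed HDF5 file. Names below are made up for the example.
infile = "eis_l0_20070820_170353.fits"   # decompressed file from the list above
outfile = "eis_20070820_170353.h5"       # hypothetical HDF5 output

with fits.open(infile, memmap=True) as hdul, h5py.File(outfile, "w") as h5:
    for i, hdu in enumerate(hdul):
        if hdu.data is None or not isinstance(hdu, (fits.PrimaryHDU, fits.ImageHDU)):
            continue
        h5.create_dataset(f"window_{i:03d}", data=np.asarray(hdu.data),
                          chunks=True, compression="gzip")

# The pay-off: a partial read loads only the requested slice into memory.
with h5py.File(outfile, "r") as h5:
    name = sorted(h5)[0]   # first stored window
    sub = h5[name][0:10]   # just its first ten rows
    print(name, sub.shape)
```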
--KenDere, 28-Sep-2007