I was running some simulations. I intended to save the simulated data as a .xlsx file so that each time I could load the same data without spending time simulating it again. The data matrix was 100,000 by 1,000, and it was saved by GAUSS 15 as a .xlsx file with "xlsWrite". (There was no memory problem on my workstation handling such a big data matrix.) Unfortunately, the Excel file could not be opened by Excel 2016/2013 (64-bit or 32-bit). If the data matrix was 100,000 by 500, there was no problem. I could have saved the data matrix as a GAUSS .fmt file, but an Excel file would be handier. So the question is: why, when GAUSS saves a large Excel file, can the data end up corrupted and unreadable in Excel?
If GAUSS is connecting to a 32-bit version of Excel, it will not be able to write a file this large. According to Microsoft:
32-bit environments are subject to 2 gigabytes (GB) of virtual address space, shared by Excel, the workbook, and add-ins that run in the same process. A data model’s share of the address space might run up to 500 – 700 megabytes (MB), but could be less if other data models and add-ins are loaded.
A 100,000 by 1,000 matrix of 8-byte doubles is about 800 MB on its own. On top of that, Excel files hold much, much more information than simply a numeric value for each cell (font, font color, font size, background color, formatting information, etc.). As an example, the PHPExcel package states that each Excel cell it caches takes at least 1 KB. So even with 64-bit Excel, this write could be consuming a great deal of memory.
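The "about 800 MB" figure is straightforward to check. A quick back-of-envelope sketch (shown here in Python for illustration; GAUSS stores matrix elements as 64-bit doubles):

```python
# Raw numeric payload of the simulated matrix, before any Excel overhead.
rows, cols = 100_000, 1_000
bytes_per_double = 8  # 64-bit floating point, as GAUSS uses internally

raw_bytes = rows * cols * bytes_per_double
print(f"{raw_bytes:,} bytes = {raw_bytes / 1024**2:.0f} MiB")
# 800,000,000 bytes, roughly 763 MiB -- the "about 800 MB" cited above
```

And that is before Excel's per-cell formatting metadata, which can multiply the in-memory footprint several times over.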
In general, a .fmt file will load much, much faster and require less memory. When you do want to place the data in a more portable file type, you could either try writing the Excel file in chunks, or print out a delimited text file.
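The delimited-text route is easy to do chunk by chunk, so the full matrix never has to pass through a spreadsheet writer at once. A minimal sketch in Python for illustration (the file name, chunk size, and random data generator are placeholders, not part of the original workflow):

```python
import csv
import random

def write_in_chunks(path, n_rows, n_cols, chunk_rows=10_000):
    """Write a simulated matrix to a comma-delimited text file one block
    of rows at a time, keeping memory use bounded by the chunk size."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for start in range(0, n_rows, chunk_rows):
            rows = min(chunk_rows, n_rows - start)
            # Hypothetical stand-in for one block of simulated data
            chunk = [[random.random() for _ in range(n_cols)]
                     for _ in range(rows)]
            writer.writerows(chunk)

# Small demo sizes; the same pattern scales to 100,000 x 1,000
write_in_chunks("sim_data.csv", n_rows=100, n_cols=5, chunk_rows=30)
```

The same chunking idea applies to Excel output in GAUSS by passing successive row ranges to the spreadsheet write command, though a plain text file avoids the per-cell overhead entirely.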