
I am running a lengthy SageMath calculation in Terminal and plan to save its results to disk using Shell > Export Text As... The resulting text file will be somewhere in the neighborhood of 10 GB.

Do I need to worry that 16 GB of RAM will not be enough for this task? In particular, I worry that both SageMath and the Terminal display are storing the results separately. SageMath's format would require less space, but by how much I don't know.

Another question: if I try to use “Shell > Export Text As…” while the calculation is running, will this cause it to stop? I'd prefer not to have to start over after three days just to find out that the answer to this question is "Yes, it will stop!"

I expect the calculation to take about two weeks total.

  • “I expect the calculation to take about two weeks total.” Wow! Figure out how to redirect that output to a file instead of “Export Text As...” [The answer posted](https://superuser.com/a/1652823/167207) suggests this, and I say that is truly the best solution to this issue. – Giacomo1968 Jun 01 '21 at 01:03
  • The calculation is easy to describe: for each p in Posets(11), print all covering relations. It takes a while to generate 46.75 million inequivalent posets. – mathematrucker Jun 01 '21 at 01:55
  • Since I am going to start over with a standalone script per the answer below, I just did "Shell > Export Text As..." to see what got generated so far. (This did not stop the calculation.) After about 72 hours, 9.5 million posets had been generated, with a file size of 1.2 GB. If this rate continues, the calculation will take about 14 or 15 days and the file shouldn't exceed 8 GB. My previous guesstimate was based on how fast the posets were scrolling in the Terminal window. – mathematrucker Jun 01 '21 at 02:08
  • Well, whatever you are doing, it makes more sense to redirect output to a separate file as it progresses. “Export Text As...” is really not well suited for that, and the chances of it failing, regardless of RAM amount, are quite high. Best of luck with this! – Giacomo1968 Jun 01 '21 at 02:10
  • Just a tip: add the ability to restart the script from a given point, because a week of calculations is just asking to be interrupted... – Moo Jun 01 '21 at 08:27

1 Answer


To answer your primary question: the Terminal scrollback buffer is limited only by the RAM on your machine (16 GB). Without knowing what your process outputs, it is hard to say whether this will present an issue.

However, rather than depending on the integrity of the scrollback buffer, especially for a process whose run time is measured in weeks, you could instead redirect the output to a file that can be inspected over time.

One way to do this might be to use the logging capabilities of SageMath.
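For example, since the Sage interactive shell is built on IPython, the %logstart magic can mirror a session to a file. This is only a minimal sketch: the file name is just an example, the -o flag is what asks it to record displayed results as well as input, and it may not capture everything a running loop prints to STDOUT, so it is worth testing on a small case first:

sage: %logstart -o sage_session.log
sage: # ... run the long calculation; input and displayed results are appended to sage_session.log ...
sage: %logstop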

An alternative is to create a standalone script; anything it would have shown in the terminal (STDOUT) can then be redirected to a file:

sage your_sage_script.sage > sage.log

The data is then written to the file sage.log and can be inspected as that file grows (for example with tail -f sage.log), without affecting the execution of the script itself.
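For the calculation described in the comments (printing the cover relations of every poset in Posets(11)), such a standalone script could look roughly like the sketch below. It is only an illustration, not a tested script; the file name posets11.sage and the output format are placeholders:

# posets11.sage -- rough sketch of the calculation described in the comments
for n, p in enumerate(Posets(11)):          # all posets on 11 elements, up to isomorphism
    # flush=True pushes each line to sage.log immediately,
    # so progress can be watched with tail -f while the script runs
    print(n, p.cover_relations(), flush=True)

It would be run exactly as above, e.g. sage posets11.sage > sage.log. Numbering the lines also records how far the run got, which helps if you take the advice in the comments and add a way to restart the script from a given point after an interruption.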

Either of the above approaches is a much better way of ensuring that your process's output is saved. For a script that will take weeks to complete, it is worth looking into these methods.

Scot
  • Thank you for this helpful info. I will write a standalone script. – mathematrucker Jun 01 '21 at 01:45
  • Nice extra benefit of sending the output to a file: it will be faster too! Console output in Terminal, with the display scrolling up on each new line and each line being appended to a HUGE scrollback buffer, is a relatively slow operation. All of that gets bypassed if you redirect the output straight to a file. For small amounts of program output it won't matter much, but in this case it probably saves you hours of overall run time. – Tonny Jun 01 '21 at 07:46