Quote Originally Posted by NichG View Post
Ctrl-C will kill a process. Is there a built-in thing?
Thank you. That's what I was thinking of.

Quote Originally Posted by NichG View Post
So there's a bit in the code that says:

Code:
while True:
   raw_text = input("Model prompt >>> ")
   while not raw_text:
      print('Prompt should not be empty!')
      raw_text = input("Model prompt >>> ")
That's what's getting the text that you're going to process. You could replace that with something like:

Code:
# filename should hold the path to a text file containing your prompt
f = open(filename, "r")
raw_text = f.read()
f.close()
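A slightly safer variant of the same idea (just a sketch; the `prompt.txt` name is a placeholder, point it at your own file) uses a `with` block, which closes the file automatically, and strips the trailing newline:

```python
# Sketch of reading the prompt from a file with a context manager.
# "prompt.txt" is a placeholder -- set filename to your own prompt file.
filename = "prompt.txt"

# Create a small prompt file here so the sketch runs end to end.
with open(filename, "w") as f:
    f.write("Once upon a time\n")

with open(filename) as f:          # the file is closed automatically
    raw_text = f.read().strip()    # drop the trailing newline

print(raw_text)
```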
Note that Python is picky about indentation, so when you remove that outer 'while True:' loop you have to make sure that everything under it is de-indented by the same amount. For example, if you had

Code:
while True:
   aaa
   while False:
      bbb
   ccc
ddd
then removing the outer while loop would give:

Code:
aaa
while False:
   bbb
ccc
ddd
As far as checkpoints go, GPT-2 checkpoints should be rather large, no? So it's not surprising that they're not storing each one separately. You'd have to pick which checkpoints you want to preserve somehow (e.g. copy them to a separate location during the training process).
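One way to do that copying is sketched below. The checkpoint directory and the `model-<step>.*` file naming are assumptions about how your training script writes checkpoints, so adjust them to match what actually shows up on disk:

```python
# Sketch: copy one checkpoint's files out of the training directory so
# later checkpoints can't overwrite it. The paths and the model-<step>.*
# naming are assumptions -- match them to your own training setup.
import glob
import os
import shutil

checkpoint_dir = "checkpoint/run1"   # where the trainer writes checkpoints
archive_dir = "saved_checkpoints"    # where preserved copies go
step = 1000                          # the training step worth keeping

os.makedirs(archive_dir, exist_ok=True)
for path in glob.glob(os.path.join(checkpoint_dir, "model-%d.*" % step)):
    shutil.copy(path, archive_dir)   # copies nothing if no files match
```

You could call something like this periodically during training, or run it by hand whenever a checkpoint looks worth keeping.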
Is there any simple way to get this to keep spitting out samples over and over again? Setting a value for nsamples in the console doesn't seem to work.