Results 31 to 46 of 46
-
2020-10-14, 06:17 AM (ISO 8601)
- Join Date
- Dec 2010
Re: Help a non-programmer install GPT-2 (and its prerequisites)
The line that says 'print(text)' is where the outputs are being generated. So you could instead write that to a file, like (at the same indent level where print is):
Code:
f = open("output.log","a")
f.write(text)
f.close()
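Equivalently, a `with` block closes the file automatically even if something goes wrong mid-write. This is just a sketch of the same idea; the filename `output.log` is an example, and `text` stands in for the model's output:

```python
# Append each generated sample to a log file; "a" opens the file in append
# mode, so repeated runs add to the file instead of overwriting it.
text = "example generated text\n"  # stands in for the model's output

with open("output.log", "a") as f:
    f.write(text)
```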
Last edited by NichG; 2020-10-14 at 06:18 AM.
-
2020-11-04, 01:05 AM (ISO 8601)
- Join Date
- Feb 2016
Re: Help a non-programmer install GPT-2 (and its prerequisites)
I figured this one out, by the way. It can be opened with Notepad, and apparently it just points the program to the correct name for the data file, so it might say something like:
Code:
model_checkpoint_path: "model-221"
all_model_checkpoint_paths: "model-210"
all_model_checkpoint_paths: "model-221"
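For anyone curious, that file is TensorFlow's plain-text checkpoint-state format, and it can be parsed with a few lines of Python. This is a sketch, not part of the original script; the keys and filenames below just mirror the snippet above:

```python
# Parse a TensorFlow "checkpoint" state file into a dict of key -> values.
# Keys can repeat (all_model_checkpoint_paths), so values are kept in lists.
def parse_checkpoint_state(path):
    state = {}
    with open(path) as f:
        for line in f:
            if ":" not in line:
                continue
            key, value = line.split(":", 1)
            state.setdefault(key.strip(), []).append(value.strip().strip('"'))
    return state

# Write a small example file matching the snippet above, then parse it.
with open("checkpoint", "w") as f:
    f.write('model_checkpoint_path: "model-221"\n')
    f.write('all_model_checkpoint_paths: "model-210"\n')
    f.write('all_model_checkpoint_paths: "model-221"\n')

print(parse_checkpoint_state("checkpoint"))
# {'model_checkpoint_path': ['model-221'],
#  'all_model_checkpoint_paths': ['model-210', 'model-221']}
```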
"If you want to understand biology don't think about vibrant throbbing gels and oozes, think about information technology" -Richard Dawkins
Omegaupdate Forum
WoTC Forums Archive + Indexing Projext
PostImage, a free and sensible alternative to Photobucket
Temple+ Modding Project for Atari's Temple of Elemental Evil
Morrus' RPG Forum (EN World v2)
-
2021-02-25, 02:45 AM (ISO 8601)
- Join Date
- Feb 2016
Re: Help a non-programmer install GPT-2 (and its prerequisites)
I can't remember what button to press to interrupt the generation process.
-
2021-02-25, 04:15 AM (ISO 8601)
- Join Date
- Dec 2010
Re: Help a non-programmer install GPT-2 (and its prerequisites)
Ctrl-C will kill a process. Is there a built-in thing?
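For what it's worth, Ctrl-C raises `KeyboardInterrupt` in Python, which a script can catch to stop generating cleanly instead of dying mid-write. A toy sketch (the function and the simulated interrupt are illustrative, not part of the GPT-2 script):

```python
# Catching Ctrl-C: wrap the generation loop so an interrupt exits cleanly.
def generate_forever(stop_after=None):
    produced = []
    try:
        step = 0
        while True:
            produced.append(f"sample-{step}")  # stand-in for one generated sample
            step += 1
            if stop_after is not None and step >= stop_after:
                raise KeyboardInterrupt  # simulate pressing Ctrl-C
    except KeyboardInterrupt:
        print(f"Interrupted after {len(produced)} samples; exiting cleanly.")
    return produced

samples = generate_forever(stop_after=3)
```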
-
2021-03-27, 06:40 PM (ISO 8601)
- Join Date
- Feb 2016
Re: Help a non-programmer install GPT-2 (and its prerequisites)
-
2021-03-27, 11:37 PM (ISO 8601)
- Join Date
- Dec 2010
Re: Help a non-programmer install GPT-2 (and its prerequisites)
Based on the code I pasted before, it looks like nsamples would work. Can you paste the current version of your code?
Edit: Okay, maybe the issue is that you're setting nsamples in the console. You have to change what you pass to the function, wherever it gets called. You could change nsamples=1 in the function definition to, say, nsamples=10 or whatever. Or pass it as an argument in the console: interact_model(nsamples = ...)
Last edited by NichG; 2021-03-27 at 11:38 PM.
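The scoping point above can be shown with a stripped-down sketch (the function name just echoes the script's `interact_model`; the body is illustrative):

```python
# A default argument only applies when the caller omits the parameter;
# assigning a variable named nsamples elsewhere has no effect on the function.
def interact_model(nsamples=1):
    return [f"sample-{i}" for i in range(nsamples)]

nsamples = 10                            # a console variable: ignored by the function
print(len(interact_model()))             # 1  -- the default is used
print(len(interact_model(nsamples=10)))  # 10 -- passed explicitly
```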
-
2021-03-28, 02:59 PM (ISO 8601)
- Join Date
- Feb 2016
Re: Help a non-programmer install GPT-2 (and its prerequisites)
I tried changing it to 3 in the code and it still just displayed one thing. I think maybe it's processing several but not displaying them.
Here's the code as it stands
Code:
#!/usr/bin/env python3

import fire
import json
import os
import numpy as np
import tensorflow as tf

import model, sample, encoder

def interact_model(
    model_name='117M',
    seed=None,
    nsamples=3,
    batch_size=1,
    length=None,
    temperature=1,
    top_k=0,
    top_p=0.0
):
    """
    Interactively run the model
    :model_name=117M : String, which model to use
    :seed=None : Integer seed for random number generators, fix seed to reproduce results
    :nsamples=1 : Number of samples to return total
    :batch_size=1 : Number of batches (only affects speed/memory). Must divide nsamples.
    :length=None : Number of tokens in generated text, if None (default), is
     determined by model hyperparameters
    :temperature=1 : Float value controlling randomness in boltzmann
     distribution. Lower temperature results in less random completions. As the
     temperature approaches zero, the model will become deterministic and
     repetitive. Higher temperature results in more random completions.
    :top_k=0 : Integer value controlling diversity. 1 means only 1 word is
     considered for each step (token), resulting in deterministic completions,
     while 40 means 40 words are considered at each step. 0 (default) is a
     special setting meaning no restrictions. 40 generally is a good value.
    :top_p=0.0 : Float value controlling diversity. Implements nucleus sampling,
     overriding top_k if set to a value > 0. A good setting is 0.9.
    """
    if batch_size is None:
        batch_size = 1
    assert nsamples % batch_size == 0

    enc = encoder.get_encoder(model_name)
    hparams = model.default_hparams()
    with open(os.path.join('models', model_name, 'hparams.json')) as f:
        hparams.override_from_dict(json.load(f))

    if length is None:
        length = hparams.n_ctx // 2
    elif length > hparams.n_ctx:
        raise ValueError("Can't get samples longer than window size: %s" % hparams.n_ctx)

    with tf.Session(graph=tf.Graph()) as sess:
        context = tf.placeholder(tf.int32, [batch_size, None])
        np.random.seed(seed)
        tf.set_random_seed(seed)
        output = sample.sample_sequence(
            hparams=hparams, length=length,
            context=context,
            batch_size=batch_size,
            temperature=temperature, top_k=top_k, top_p=top_p
        )

        saver = tf.train.Saver()
        ckpt = tf.train.latest_checkpoint(os.path.join('models', model_name))
        saver.restore(sess, ckpt)

        f = open("sample.txt", "r")
        raw_text = f.read()
        f.close()
        context_tokens = enc.encode(raw_text)
        generated = 0
        for _ in range(nsamples // batch_size):
            out = sess.run(output, feed_dict={
                context: [context_tokens for _ in range(batch_size)]
            })[:, len(context_tokens):]
        for i in range(batch_size):
            generated += 1
            text = enc.decode(out[i])
            print("=" * 40 + " SAMPLE " + str(generated) + " " + "=" * 40)
            print(text)
            print("=" * 80)

if __name__ == '__main__':
    fire.Fire(interact_model)
-
2021-03-28, 08:54 PM (ISO 8601)
- Join Date
- Dec 2010
Re: Help a non-programmer install GPT-2 (and its prerequisites)
It looks like maybe you dropped an indent. Instead of:
Code:
for _ in range(nsamples // batch_size):
    out = sess.run(output, feed_dict={
        context: [context_tokens for _ in range(batch_size)]
    })[:, len(context_tokens):]
for i in range(batch_size):
    generated += 1
    text = enc.decode(out[i])
    print("=" * 40 + " SAMPLE " + str(generated) + " " + "=" * 40)
    print(text)
it should be:
Code:
for _ in range(nsamples // batch_size):
    out = sess.run(output, feed_dict={
        context: [context_tokens for _ in range(batch_size)]
    })[:, len(context_tokens):]
    for i in range(batch_size):
        generated += 1
        text = enc.decode(out[i])
        print("=" * 40 + " SAMPLE " + str(generated) + " " + "=" * 40)
        print(text)
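The effect of that dropped indent can be sketched with plain strings standing in for the samples: with the inner loop outside the outer one, `out` is overwritten on every pass and only the last batch is ever printed, which matches "it still just displayed one thing".

```python
# Toy illustration of the indentation bug: the "samples" are just strings,
# but the loop structure mirrors the generation code.
def collect_samples_dedented(nsamples, batch_size=1):
    collected = []
    for step in range(nsamples // batch_size):
        out = [f"sample-{step}-{i}" for i in range(batch_size)]  # overwritten each pass
    for i in range(batch_size):  # runs once, after the loop: sees only the last out
        collected.append(out[i])
    return collected

def collect_samples_indented(nsamples, batch_size=1):
    collected = []
    for step in range(nsamples // batch_size):
        out = [f"sample-{step}-{i}" for i in range(batch_size)]
        for i in range(batch_size):  # runs every pass: sees every out
            collected.append(out[i])
    return collected

print(collect_samples_dedented(3))  # ['sample-2-0'] -- one sample
print(collect_samples_indented(3))  # ['sample-0-0', 'sample-1-0', 'sample-2-0'] -- three
```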
Last edited by NichG; 2021-03-28 at 08:54 PM.
-
2021-03-29, 09:03 AM (ISO 8601)
- Join Date
- Dec 2009
- Location
- Birmingham, AL
- Gender
Re: Help a non-programmer install GPT-2 (and its prerequisites)
The Mod on the Silver Mountain: Thread timed out.
Cuthalion's art is the prettiest art of all the art. Like my avatar.
Number of times Roland St. Jude has sworn revenge upon me: 2
-
2021-03-30, 07:22 AM (ISO 8601)
- Join Date
- Mar 2007
- Location
- Grognardia
- Gender
Re: Help a non-programmer install GPT-2 (and its prerequisites)
Metamagic Mod: thread re-opened.
(Avatar by Cuthalion, who is great.)
-
2021-05-01, 03:09 PM (ISO 8601)
- Join Date
- Feb 2016
Re: Help a non-programmer install GPT-2 (and its prerequisites)
-
2021-05-01, 06:48 PM (ISO 8601)
- Join Date
- Dec 2010
Re: Help a non-programmer install GPT-2 (and its prerequisites)
Nope. I thought it could be because the library moved where it put the random stuff in its hierarchy, but I checked and tf.random should exist.
I should also say, I don't actually use this particular implementation anymore. I'd probably use Huggingface's implementations since they have a standardized interface, and they're expanding to things like a community-trained GPT-3 model for example. But if you finally have this one working I can understand not wanting to switch.
-
2021-06-08, 12:00 AM (ISO 8601)
- Join Date
- Feb 2016
Re: Help a non-programmer install GPT-2 (and its prerequisites)
I'm actually not sure if my computer would be able to run GPT-3. It barely runs GPT-2.
-
2021-06-08, 12:56 AM (ISO 8601)
- Join Date
- Dec 2010
Re: Help a non-programmer install GPT-2 (and its prerequisites)
One of the ways to do this kind of stuff is to use Google Colab instances. You can basically use one for free for something like 12 hours a day, and you can get a GPU allocation that can run most of the stuff people are playing with. The code basically loads in a Jupyter Notebook interface and for a lot of the good ones, there's a simple 'enter prompt here and press Run All' kind of workflow to it.
I'm not sure there's one for the community GPT-3 yet, but these things exist for DALL-E and various other interfaces to CLIP (various kinds of 'specify a sentence and it will draw the image' models that have exploded this year).
-
2021-10-02, 02:26 PM (ISO 8601)
- Join Date
- Feb 2016
Re: Help a non-programmer install GPT-2 (and its prerequisites)
Do you know what the recommended system requirements are to run the 345M model locally, and/or what the minimum system requirements are to run it without risking a system crash (as happened the last time I attempted to train the 345M model)?
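There's no official figure I know of, but a back-of-the-envelope estimate gives a sense of scale: 345M float32 parameters are about 1.4 GB of weights alone, and training with Adam roughly quadruples that (weights, gradients, and two moment buffers) before activations are counted. The multipliers below are rules of thumb, not measured requirements:

```python
# Rough memory estimate for the 345M GPT-2 model (rule-of-thumb multipliers).
params = 345e6
bytes_per_param = 4  # float32

weights_gb = params * bytes_per_param / 1e9
# Adam keeps weights, gradients, and two moment buffers: ~4x the weights.
training_gb = 4 * weights_gb

print(f"weights alone: ~{weights_gb:.2f} GB")
print(f"training (Adam, before activations): ~{training_gb:.2f} GB")
```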
-
2021-10-02, 07:02 PM (ISO 8601)
- Join Date
- Mar 2007
- Location
- Grognardia
- Gender
Re: Help a non-programmer install GPT-2 (and its prerequisites)
Metamagic Mod: thread Necromancy
(Avatar by Cuthalion, who is great.)