GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text. The diversity of the dataset causes this simple goal to contain naturally occurring demonstrations of many tasks across diverse domains. GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.
In this example we will look at how to use GPT-2 in the Interplay low-code platform.
Drag an HTTP In node onto the canvas and set its URL to /gpt/:gpt. Also drag and drop an HTTP Out node.
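The :gpt segment of the URL is a path parameter, so whatever a caller appends after /gpt/ is passed along to the downstream nodes as part of the message. A minimal sketch of pulling the prompt out of the message (the exact msg layout here is an assumption based on typical Node-RED-style HTTP In nodes, not a documented Interplay structure):

```python
# Hypothetical shape of the message an HTTP In node emits when a
# request hits /gpt/:gpt -- the :gpt path segment arrives as a
# named request parameter.
msg = {
    "req": {"params": {"gpt": "Once upon a time"}},
    "payload": {},
}

# The function node can read the prompt from the request parameters.
prompt = msg["req"]["params"]["gpt"]
print(prompt)  # "Once upon a time"
```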
Add a Python 3 function node and name it GPT2-TextGen. Paste the GPT-2 generation code into the function body.
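The GPT-2 code itself can stay small. The sketch below uses the Hugging Face transformers text-generation pipeline; the helper name generate_text and the injectable generator argument are illustrative assumptions, not Interplay or transformers APIs:

```python
def generate_text(prompt, generator=None, max_length=50):
    """Return a GPT-2 continuation of `prompt`.

    `generator` is injectable so the helper can be exercised without
    downloading the model; by default it builds a Hugging Face
    text-generation pipeline for the pretrained "gpt2" checkpoint.
    """
    if generator is None:
        # Requires the `transformers` package (pip install transformers).
        from transformers import pipeline
        generator = pipeline("text-generation", model="gpt2")
    # do_sample=True is what makes each call (and each page refresh)
    # produce a different continuation.
    outputs = generator(prompt, max_length=max_length, do_sample=True)
    return outputs[0]["generated_text"]
```

Inside the function node, the returned text would then be assigned to the outgoing message payload so the HTTP Out node can send it back as the response.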
Connect the Python 3 node to a console (debug) node and to the HTTP Out node, then click Deploy.
Open your browser and go to your Interplay URL ending in /gpt/:gpt, replacing :gpt with a prompt. You will see a different prediction each time you refresh the page.
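Requests do not have to come from a browser. Below is a small sketch of building the request URL programmatically, assuming a hypothetical host name and using the prompt text as the :gpt path segment:

```python
from urllib.parse import quote

# Hypothetical base URL; replace with your own Interplay instance.
BASE = "https://your-instance.interplay.example"

prompt = "Once upon a time"
# quote() percent-encodes spaces and other unsafe characters so the
# prompt fits in a single URL path segment.
url = f"{BASE}/gpt/{quote(prompt)}"
print(url)

# To fetch a generation from a deployed flow:
# import urllib.request
# print(urllib.request.urlopen(url).read().decode())
```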
You can also see the output in the terminal's debug panel.