🟢 LLM Settings
- Understand Temperature, Top P, and Maximum Length
We can use certain LLM settings to control various aspects of the model's output, such as how 'random' it is. These settings can be adjusted to produce more creative, diverse, and interesting output. The Temperature, Top P, and Maximum Length settings are the most important, but we describe every setting that the OpenAI Playground allows you to modify.
The output produced with a higher temperature setting offers a more imaginative and diverse list of activities to do at the beach. This can be very useful for creative writing.
If you adjust the temperature too high, you can get nonsensical outputs like
Start a sponge-ball baseball home run contest near Becksmith Stein Man Beach.
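Under the hood, temperature rescales the model's raw token scores (logits) before they are turned into probabilities. Here is a minimal sketch of that mechanism; the three logit values are made up for illustration, not taken from any real model:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw logits to probabilities, scaled by temperature.

    Lower temperature sharpens the distribution toward the most likely
    token; higher temperature flattens it, making rare tokens more
    likely to be sampled (and the output more 'random').
    """
    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical scores for three candidate tokens
cool = softmax_with_temperature(logits, 0.5)  # top token dominates
hot = softmax_with_temperature(logits, 2.0)   # probabilities flatten out
```

With `temperature=0.5` the first token takes most of the probability mass; at `temperature=2.0` the same three tokens end up much closer together, which is why high-temperature outputs wander.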
Top P is a setting in language models that helps manage the randomness of their output. It works by establishing a probability threshold and then sampling only from the smallest set of tokens whose combined likelihood surpasses that threshold.
Consider the following example: The cat climbed up the ___. The top five words the model might be considering could be tree (probability .50), roof (probability .25), wall (probability .15), window (probability .07), and carpet (probability .03). If we set Top P to .90, the AI will only consider those tokens which cumulatively add up to at least ~90%. In our case:
- Adding tree → the total so far is 50%.
- Then adding roof → the total becomes 75%.
- Next comes wall, and now our sum reaches 90%.
So, for generating output, the AI will randomly pick one among these three options (tree, roof, and wall), as together they make up around ~90 percent of all likelihoods. This method can produce more diverse outputs than traditional methods that sample from the entire vocabulary indiscriminately, because it narrows down choices based on cumulative rather than individual token probabilities.
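The selection step described above is often called nucleus sampling. A minimal sketch of the filtering logic, using the illustrative cat-sentence probabilities from this example:

```python
def top_p_filter(token_probs, top_p):
    """Keep the smallest set of tokens whose cumulative probability
    reaches the top_p threshold (nucleus sampling). The model then
    samples only from this reduced set."""
    ranked = sorted(token_probs.items(), key=lambda kv: kv[1], reverse=True)
    kept, cumulative = [], 0.0
    for token, prob in ranked:
        kept.append(token)
        cumulative += prob
        # small tolerance guards against floating-point rounding
        if cumulative >= top_p - 1e-9:
            break
    return kept

# Illustrative probabilities for "The cat climbed up the ___"
probs = {"tree": 0.50, "roof": 0.25, "wall": 0.15,
         "window": 0.07, "carpet": 0.03}
top_p_filter(probs, 0.90)  # → ["tree", "roof", "wall"]
```

With `top_p=0.90`, window and carpet are excluded; lowering the threshold to 0.50 would leave only tree in play.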
Other LLM Settings
There are many other settings that can affect language model output, such as stop sequences, and frequency and presence penalties.
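The two penalties both discourage repetition, but in different ways: the frequency penalty grows with each repeat of a token, while the presence penalty is a flat, one-time cost for any token that has appeared at all. A sketch of that adjustment, modeled on the penalty formula described in OpenAI's API documentation (the token names and scores here are made up):

```python
from collections import Counter

def apply_penalties(logits, generated_tokens,
                    frequency_penalty=0.0, presence_penalty=0.0):
    """Lower the scores of tokens the model has already produced.

    frequency_penalty scales with how many times a token appeared;
    presence_penalty applies once to any token that appeared at all.
    """
    counts = Counter(generated_tokens)
    adjusted = dict(logits)
    for token, count in counts.items():
        if token in adjusted:
            adjusted[token] -= count * frequency_penalty
            adjusted[token] -= presence_penalty  # flat, one-time cost
    return adjusted

logits = {"beach": 2.0, "sand": 1.5, "waves": 1.0}
# "beach" appeared twice, "sand" once, "waves" not yet
apply_penalties(logits, ["beach", "beach", "sand"],
                frequency_penalty=0.5, presence_penalty=0.2)
```

Here "beach" is penalized twice as hard by the frequency penalty as "sand", while "waves", which hasn't been generated yet, keeps its original score.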
Even when Temperature and Top P are both set to zero, the AI may not give the exact same output every time. This is due to randomness in the GPU (graphics processing unit) calculations being done in the AI's "brain".
In conclusion, mastering settings like Temperature, Top P, Maximum Length, and others is essential when working with language models. These parameters allow for precise control of the model's output to cater to specific tasks or applications. They manage aspects such as randomness in responses, response length, and repetition frequency, all contributing to better interactions with the AI.
Partly written by jackdickens382