insertcacheorselectpaymenttype (@insertcacheorselectpaymenttype@programming.dev)

As I understand it, the big innovation that allowed things to advance so fast is called "attention".

Basically, it's not just learning which word comes next; it's also learning which of the previous words is the most important context for predicting the next word. This is what allows it to learn the grammatical structure of language, which is important for a couple of reasons.
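If it helps to see that concretely, here's a minimal numpy sketch of scaled dot-product attention with a causal mask, the standard formulation from the "Attention Is All You Need" paper. The variable names and toy sizes are mine, not from any particular model:

```python
# A minimal sketch of the "attention" idea in plain numpy.
# Names and shapes are illustrative, not any particular model's API.
import numpy as np

def causal_attention(Q, K, V):
    """Each position builds a weighted mix of earlier positions' values.

    Q, K, V: (seq_len, d) arrays of query/key/value vectors, one row per word.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # how relevant is each word to each other word
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf                 # a word may only look at itself and earlier words
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: each row sums to 1
    return weights @ V                     # context vector used to predict the next word

# Toy usage: 4 "words", each represented by an 8-dimensional vector
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(causal_attention(Q, K, V).shape)  # (4, 8)
```

The softmax row for each word is exactly the "which previous words matter most" weighting described above.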

For one, it can use that info to better identify whether data in the training set is related to the subject at hand. E.g. it can better pick out keywords, so it knows that two data sets containing the word "the" might not be related at all, but two data sets containing the word "silicon" are probably highly related.
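That "common words carry little signal" intuition actually predates attention; it's classic inverse-document-frequency weighting. Here's a toy sketch of it (the corpus and scores are made up purely for illustration):

```python
# Toy illustration of why a shared rare word ("silicon") signals relatedness
# while a shared common word ("the") doesn't. This is classic IDF weighting,
# not attention itself, but the intuition is the one described above.
import math

corpus = [
    {"the", "cat", "sat"},
    {"the", "silicon", "wafer"},
    {"the", "silicon", "chip"},
    {"the", "dog", "ran"},
]

def idf(word):
    # Rare words get high weight; words in every document get weight zero.
    docs_with_word = sum(word in doc for doc in corpus)
    return math.log(len(corpus) / docs_with_word)

def overlap_score(doc_a, doc_b):
    return sum(idf(w) for w in doc_a & doc_b)

print(overlap_score(corpus[1], corpus[2]))  # share "silicon": ~0.69
print(overlap_score(corpus[0], corpus[3]))  # share only "the": 0.0
```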

It might even be able to "understand meaning" by finding the relationships of words to each other. If it comes across the word "minor" used for the musical key and not the child, it can figure out not only that "key" is important to the subject at hand, but also that information about children is actively harmful to predicting the next word.
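To make that concrete, here's a hand-built toy (the vectors are invented, not learned) showing how the softmax over query-key scores can push a misleading word's weight toward zero:

```python
# Made-up example: a query for "minor" in a musical context should weight
# "key" heavily and "child" barely at all. Vectors are hand-picked toys.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

query_minor = np.array([1.0, 0.0])          # the "music sense" direction
keys = {
    "key":   np.array([0.9, 0.1]),          # aligned with the music sense
    "chord": np.array([0.8, 0.2]),
    "child": np.array([-0.5, 1.0]),         # points the other way
}
scores = np.array([query_minor @ k for k in keys.values()]) * 4  # sharpen
for word, w in zip(keys, softmax(scores)):
    print(f"{word}: {w:.2f}")
# key: ~0.60, chord: ~0.40, child: ~0.00  -> the "child" sense is suppressed
```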

It then uses this information to stay on topic, avoid mixing nonsense sentence structures together, and, increasingly, predict what an expert in the field might say.

There are more tricks to it than this, including ones that I don't know about or understand, but I've heard that the recent advancement is largely due to "attention".