Karpathy's new tutorial goes viral, and netizens rush to offer him H100s: recreating GPT-2 training from scratch

WBOY
Published: 2024-06-07 10:29:25

Master Karpathy is no longer content with building Llama in C!

The latest challenge he has set himself: reproduce OpenAI's classic results, starting with the basic version of GPT-2.

That the challenge succeeded is no surprise. But completing the training for only 20 US dollars in 90 minutes, with loss and evals surpassing the original version, is just! a! bit! over! the! top!


Not only that, he wrote up the full reproduction process as a tutorial, and as expected, it went viral again.


Because Karpathy rented A100s from a cloud service, training the 124M version cost him US$20.

However, someone followed the tutorial on H100s instead. Training not only finished faster, it was also cheaper: done in 43 minutes for only US$14.


In addition, Karpathy spent US$200 out of his own pocket to reproduce the 350M version of GPT-2 for everyone.

But the extra-large 1.5B version would, by his estimate, take a week and about US$2,500, which is a bit hard to afford, mainly because he doesn't have any H100s on hand.


Fortunately, the deep-pocketed netizens are generous, stepping up when it's time to step up:

I'll give you some whenever you need them!


And only charge you $2 an hour!


90 minutes to reproduce GPT-2

This GPT-2 reproduction is still built on his llm.c code base, with training running end to end.

He has been polishing the code base continuously over the past few days, and kicking off a training run is now very simple:

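For reference, here is roughly the shape of the run from the tutorial (GitHub discussion #481): prepare the FineWeb data, compile the CUDA trainer, and launch across 8 GPUs. The exact flags evolve with the repo, so treat this as a sketch and check the tutorial for the current invocation.

    # Sketch of the GPT-2 (124M) run, following the llm.c tutorial.
    # Flags change as the repo evolves; see discussion #481 for the
    # exact current invocation.
    git clone https://github.com/karpathy/llm.c.git
    cd llm.c

    # Download and tokenize the FineWeb 10B-token sample into train/val shards
    python dev/data/fineweb.py --version 10B

    # Compile the CUDA trainer (USE_CUDNN=1 enables cuDNN flash attention)
    make train_gpt2cu USE_CUDNN=1

    # Train the 12-layer (124M) model on one 8-GPU node:
    # -b micro-batch size, -t sequence length, -d total batch size in tokens
    # (~0.5M), -l max learning rate, -u warmup steps, -v validation interval,
    # -s sampling interval, -h 1 evaluates HellaSwag during training
    mpirun -np 8 ./train_gpt2cu \
        -i "dev/data/fineweb10B/fineweb_train_*.bin" \
        -j "dev/data/fineweb10B/fineweb_val_*.bin" \
        -o log124M \
        -e "d12" \
        -b 64 -t 1024 \
        -d 524288 \
        -l 0.0006 \
        -u 700 \
        -v 250 -s 20000 \
        -h 1

On an 8×A100 80GB node this is the roughly 90-minute, US$20 run described here; on 8×H100s it finishes in about 43 minutes.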

Specifically, the network architecture is GPT-2, but many of the hyperparameter settings follow GPT-3's.

Karpathy's analysis: by the standard of the Chinchilla scaling law, GPT-2's training on 100B tokens was over-training, with diminishing returns later on. By that calculation, about 2.5B tokens is enough for the 124M model.
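For context, the Chinchilla rule of thumb is roughly 20 training tokens per model parameter, which is where that figure comes from:

    D_opt ≈ 20 × N_params = 20 × 124M ≈ 2.5B tokens

The original GPT-2's 100B tokens is about 40 times this compute-optimal amount, hence "over-trained" by that standard.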

However, he trained on 10B tokens anyway, using the just-released FineWeb dataset, whose token quality is higher than OpenAI's original WebText dataset.

The original WebText was never made public, so a controlled comparison under identical conditions is impossible. Besides, the distribution of internet data today may be very different from that of 5 years ago.

These differences are presumably why the eval scores came out higher than the original version's.


In addition, some netizens noticed that GPU utilization during training was also higher than in OpenAI's original work, but Karpathy said this is mainly because everything runs on a single cloud node, so there are no inter-server communication issues to worry about.


Finally, the 350M version of GPT-2 he trained also beat the original version's results.


Applause all around~

The master isn't grinding that hard

Since resigning from OpenAI in February this year, Karpathy has turned out a string of large-model results in C, playing his way from Llama to GPT.

A look at his GitHub heat map shows he only took a short break at the beginning, and the activity has kept ramping up since April.


Is this what resigning to stay home and work 997 (9am to 9pm, seven days a week) looks like?

Actually, Karpathy has also traveled during this period and shared the games he's been playing; it's not as much of a grind as it looks.


According to the weekly schedule he posted: while employed he worked 975 (9am to 7pm, five days a week); since resigning, it's 4 to 20 hours a day, depending on his mood.

  • Monday: 4 hours
  • Tuesday: 14 hours, until 11pm
  • Wednesday: insomnia, up at 4am to write code, crashed at noon
  • Thursday: 20 hours
  • Friday: rest
  • Saturday: 12 hours
  • Sunday: 4 hours
  • Then off traveling for two weeks


Seeing this, everyone wondered: does a regular schedule feel better, or does randomness work wonders?

Karpathy himself isn't sure, but says a chaotic schedule is definitely more interesting.


Finally, he also shared a lesson from his freelancing experience:

Start working right after getting up, before reading any news, and only go online after lunch, so that outside information doesn't become a distraction.


Friends whose circumstances allow can give it a try.

Tutorial: https://github.com/karpathy/llm.c/discussions/481

Reference link:
[1] https://x.com/karpathy/status/1795484547267834137
[2] https://www.threads.net/@karpathy


Source: 51cto.com