[Day 199] Continuing with Build an LLM from scratch

Hello :) Today is Day 199! A quick summary of today: I saw that all the chapters from the book Build an LLM from scratch have been published, so I decided to continue with it (after a few months of waiting).

I like that even though we are in chapter 5 (out of 7), the author still reminds us, the learners, of the process of going from input text to LLM-generated text, making sure we are still on the same page.

The goal of this chapter is to train a model, because at the moment, when we try to generate some text, we get gibberish. Right now the untrained model is given "Every effort moves you", and the model continues this with "rentingetic wasnم refres RexMeCHicular stren". By the way, the current (untrained) model has the following config:

How can the model learn? Its weights need to be updated so that they start to predict the target tokens. Here comes good ol' backpropagation. And it requires a loss function which calculates the difference between desired and actual output (i.e. how far off the…
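Since the chapter is about turning "how far off is the model" into a number, here is a minimal PyTorch sketch of that idea. The config values and the cross_entropy_loss helper below are illustrative assumptions (the post's actual config was shown as a screenshot and may differ), not the book's exact code:

```python
import torch

# Assumed GPT-2-style config; the exact values in the post's screenshot may differ.
GPT_CONFIG = {
    "vocab_size": 50257,     # BPE tokenizer vocabulary size
    "context_length": 256,   # maximum number of input tokens
    "emb_dim": 768,          # embedding dimension
    "n_heads": 12,           # attention heads per transformer block
    "n_layers": 12,          # number of transformer blocks
    "drop_rate": 0.1,        # dropout rate
    "qkv_bias": False,       # whether Q/K/V projections use bias terms
}

def cross_entropy_loss(logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
    """Compare the model's predicted token scores with the target token ids.

    logits:  (batch, seq_len, vocab_size) raw scores from the model
    targets: (batch, seq_len) the token ids the model *should* have predicted
    """
    return torch.nn.functional.cross_entropy(
        logits.flatten(0, 1),   # (batch * seq_len, vocab_size)
        targets.flatten(),      # (batch * seq_len,)
    )

# Toy example: random logits stand in for an untrained model's output.
batch, seq_len = 2, 4
logits = torch.randn(batch, seq_len, GPT_CONFIG["vocab_size"], requires_grad=True)
targets = torch.randint(0, GPT_CONFIG["vocab_size"], (batch, seq_len))

loss = cross_entropy_loss(logits, targets)
loss.backward()  # backpropagation: gradients tell us how to update the weights
print(loss.item())
```

The key point is that the loss compares the predicted distribution over the vocabulary with the target token at every position, and backpropagation turns that single number into weight updates.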

[Day 198] Transactions Data Streaming Pipeline Project [v1 completed]

Hello :) Today is Day 198! A quick summary of today: data streaming pipeline project [v1 done].

Here is a link to the project's repo. Well ... I did not know I could do it in a day (~14 hours) after yesterday's issues, but here we are. It turns out that in order to insert the full data (~70 variables with nested/list structure), I need the proper pyspark schema. Yesterday I did not have that, and that is why, when I was reading the data coming from the kafka producer, I was getting NULL in the columns: my schema was wrong. Today I not only fixed the schema for the 4 variables I had yesterday, but included *all* the variables that come from the Stripe API, ~70 of them (for completeness).

When I run docker-compose, the data streams in and gets written to the postgres db (and it is still running). Unfortunately, the free Stripe API for creating realistic transactions has a limit of 25, so every 3 seconds, 25 new transactions are sent to the db. It has been running for half the day (since I got that set up) and as I am writing…
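To make the schema point concrete, here is a minimal PySpark sketch of what "the proper schema" could look like for a nested/list payload read from Kafka. The field names, topic name, and bootstrap server below are illustrative assumptions, not the project's actual ~70-field Stripe schema:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (
    ArrayType, LongType, StringType, StructField, StructType
)

# Illustrative schema only -- the real Stripe payload has ~70 fields.
transaction_schema = StructType([
    StructField("id", StringType()),
    StructField("amount", LongType()),
    StructField("currency", StringType()),
    # Nested struct field
    StructField("billing_details", StructType([
        StructField("name", StringType()),
        StructField("email", StringType()),
    ])),
    # List/array field
    StructField("refunds", ArrayType(StructType([
        StructField("id", StringType()),
        StructField("amount", LongType()),
    ]))),
])

spark = SparkSession.builder.appName("stripe-stream").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed address
    .option("subscribe", "transactions")                   # assumed topic name
    .load()
)

# Parse the Kafka value column (JSON string) into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), transaction_schema).alias("tx"))
       .select("tx.*")
)
```

If the schema disagrees with the incoming JSON, from_json yields NULLs rather than failing loudly, which matches the symptom described above.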