[Day 128] IBM Consulting Insights Virtual Careers event + More of CS109

 Hello :)
Today is Day 128!


A quick summary of today:
  • attended IBM's Consulting Insights virtual careers event
  • watched a few more lectures of Stanford's CS109 Probability for Computer Scientists


Firstly, about the IBM consulting event

It was an introduction to the different positions IBM offers in its Consulting department. It was held over Webex, and many 'IBMers' showed up - from interns and graduates to permanent employees, all from different backgrounds. Below are some pics I took of the presentation.

They presented how IBM thinks about consulting, and the importance of putting clients first, embracing new perspectives, open ecosystems and collaborative approaches, and restless reinvention and innovation.
They shared some habits of success to flourish in our careers:
  • build client trust
  • collaborate to succeed
  • grow with endless curiosity
  • embrace diverse perspectives
  • innovate with purpose
  • deliver with impact
What skills do we need for consulting? Here is what IBMers think
They also shared about their project development process.

And how they employ modern techniques like Agile using retrospectives, standups, work breakdown, and planning walls.

And very importantly: they want all IBMers to be eager and excited to learn, and they have support systems established for continuous learning.

We were also introduced to IBM's learning platform, SkillsBuild, and invited to join a specially crafted learning path for the Consulting virtual event, covering the courses below.
I will put this on my to-do list ^^

And ~ we got a certificate 


Secondly, about CS109

Lecture 17: Adding Random Variables

Talked about adding two distributions, i.e. adding random variables - or convolutions, if you want to sound fancy, as Professor Piech said haha.

Sum of 2 uniforms looks like a triangle
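
To convince myself of the triangle shape, I wrote a tiny simulation afterwards - a rough sketch of my own (assuming numpy is installed), not code from the lecture:

import numpy as np

# Sum two independent Uniform(0, 1) random variables and bucket the sums:
# the histogram should look like a triangle on [0, 2], peaking at 1.
rng = np.random.default_rng(0)
n = 1_000_000
s = rng.uniform(0, 1, n) + rng.uniform(0, 1, n)

counts, edges = np.histogram(s, bins=10, range=(0, 2))
for count, left in zip(counts, edges[:-1]):
    print(f"{left:.1f}-{left + 0.2:.1f} | " + "#" * (count // 5000))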
What if we add many of any distribution?

The Central Limit Theorem

The professor's teaching method is so nice, I wish I had found this earlier in my journey haha.

Lecture 18: Central Limit Theorem

We looked at the sample mean and sample variance, and the variance of the sample mean. We saw how the distribution of sample means from a population, regardless of the population's original distribution, tends to follow a normal distribution as the sample size increases.
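
Out of curiosity I also simulated this - a quick sketch of my own (assuming numpy), not part of the course materials:

import numpy as np

# The population is Exp(1), which is very skewed, yet the distribution of
# sample means concentrates around 1 with spread ~ 1/sqrt(n), as the CLT predicts.
rng = np.random.default_rng(1)
for n in (2, 10, 100):
    sample_means = rng.exponential(scale=1.0, size=(100_000, n)).mean(axis=1)
    print(f"n={n:3d}  mean={sample_means.mean():.3f}  "
          f"std={sample_means.std():.3f}  predicted std={1 / np.sqrt(n):.3f}")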

Lecture 19: Bootstrapping and p-values
Talked about how to estimate the sampling distribution of a statistic - in this case the mean and variance - and, once we have some estimates, how to test whether they are good estimators, using p-values.
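
Here is roughly how I understood the bootstrap idea, as a small sketch of my own (the data values are made up):

import numpy as np

# Resample the observed sample with replacement many times to approximate the
# sampling distribution of a statistic - here, the sample mean.
rng = np.random.default_rng(2)
sample = np.array([2.1, 3.5, 2.9, 4.0, 3.3, 2.7, 3.8, 3.1])  # made-up data

boot_means = np.array([
    rng.choice(sample, size=len(sample), replace=True).mean()
    for _ in range(10_000)
])
print("bootstrap estimate of the mean:", round(boot_means.mean(), 3))
print("bootstrap standard error of the mean:", round(boot_means.std(), 3))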

Lecture 20: Algorithmic analysis

Learned about the law of total expectation
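
To make the law of total expectation concrete, here is a toy example I made up myself (not the professor's): roll a fair die to get Y, then flip Y fair coins and let X be the number of heads. Then E[X | Y = y] = y/2, so E[X] = E[E[X | Y]] = E[Y]/2 = 3.5/2 = 1.75. A quick check with numpy:

import numpy as np

# Law of total expectation: E[X] = E[E[X | Y]].
# Y is a fair die roll, X | Y ~ Binomial(Y, 0.5), so E[X] should be ~1.75.
rng = np.random.default_rng(3)
trials = 1_000_000
y = rng.integers(1, 7, size=trials)   # die roll Y in {1, ..., 6}
x = rng.binomial(y, 0.5)              # number of heads out of Y coin flips
print("simulated E[X]:", round(x.mean(), 3))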


Tomorrow, the lectures begin with maximum likelihood estimation and ML. I am excited to see how Professor Chris Piech will introduce these. I really wish I had found these lectures at the start of my journey, but just watching them now and confirming my knowledge still feels good. And also, the professor is amazing!


That is all for today!

See you tomorrow :)
