Being able to quantify the amount of information in a sequence is important in many fields and applies to many data sets. Shannon entropy is one such information-theoretic method: given a random variable and historical data about its occurrences, it quantifies the average level of information!
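As a minimal sketch of the idea above: treating the observed symbol frequencies of a sequence as probabilities, the entropy is the average of -log2(p) over the symbols. The `shannon_entropy` helper name below is my own, not from the post.

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Average information (in bits per symbol) of a sequence,
    using empirical symbol frequencies as the probabilities."""
    counts = Counter(sequence)
    total = len(sequence)
    # H(X) = -sum_x p(x) * log2 p(x)
    return -sum(
        (c / total) * math.log2(c / total)
        for c in counts.values()
    )

# A fair coin yields 1 bit per toss; a constant sequence carries no information.
print(shannon_entropy("HTHT"))  # 1.0
print(abs(shannon_entropy("AAAA")))  # 0.0
```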
Generating art with AI is surprisingly easy! Here are 5 links to get you started!
This week I'll be sharing 5 books on Machine Learning that I highly recommend. In no particular order: I read all of them, and they were truly helpful throughout my PhD!
I remember when I first started doing research, I always felt bad and stressed whenever I generated a negative result. My thought process was that I must have messed something up along the way and that I wasn't good enough.
All assumptions, especially those underlying the project's creation, should be documented and fact-checked!
I've never seen a research / data science project that went exactly as planned upfront. Projects either change incrementally, pivot drastically, or fail.
A year and a half ago, I published a paper using some code I created! In this blog post, I check whether I could reproduce it easily!
TL;DR: navigating layers of abstraction with the least amount of mental overhead is not taught, but it should be.
For some reason, one type of help I've always been very good at providing is solving emotionally charged issues. I've drafted the framework I use to tackle these so that others can draw inspiration from it!
One of my engineers asked me a wonderful question today: how much information should be documented in a task you assign to yourself?