Cross-Entropy and Kullback-Leibler Divergence

In the context of information theory, cross-entropy is used to calculate the expected length of the code representation for a message when we assume a probability distribution \(q\) for the message's...
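
As a quick reference using standard notation (which may differ from the article's), the cross-entropy of a true distribution \(p\) against an assumed distribution \(q\), and its relation to the Kullback-Leibler divergence, are:

\[
H(p, q) = -\sum_{x} p(x) \log q(x) = H(p) + D_{\mathrm{KL}}(p \,\|\, q)
\]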

Deriving Poisson Probability Function from Binomial Distribution

The following is the formula to calculate the probability of an event \(X\) happening \(k\) times in a time period, given that the rate of occurrence of the event in...
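
For reference, the result being derived, stated with the usual rate parameter \(\lambda\) (the article's own notation may differ), is the limit of the binomial probability as the number of trials \(n\) grows while \(p = \lambda/n\):

\[
P(X = k) = \lim_{n \to \infty} \binom{n}{k} \left(\frac{\lambda}{n}\right)^{k} \left(1 - \frac{\lambda}{n}\right)^{n-k} = \frac{\lambda^{k} e^{-\lambda}}{k!}
\]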

Königsberg Bridge Problem

This article originally began as a test page for a few JavaScript libraries and plugins I can use to write code that is processed and rendered as a graph on the client side...

Shannon Entropy and Uncertainty of Information

Shannon entropy is commonly used in malware analysis, and I started writing this article in an attempt to better understand Shannon entropy after reading this paper by Duc-Ly Vu...
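
For context, the standard definition of Shannon entropy for a discrete random variable \(X\), measured in bits with a base-2 logarithm (the article's notation may differ), is:

\[
H(X) = -\sum_{i} p(x_i) \log_{2} p(x_i)
\]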

Image Processing Convolution Kernels

As someone who’s relatively inexperienced with image processing, I started by looking at random image kernels on the web before applying them in the MATLAB environment while testing some...
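
As a minimal sketch of the idea (in Python rather than MATLAB, with a hypothetical random image standing in for real data), applying a common 3×3 sharpening kernel by 2-D convolution looks like this:

```python
import numpy as np
from scipy.signal import convolve2d

# Hypothetical grayscale image with values in [0, 1]; a real workflow would load one from disk.
image = np.random.rand(64, 64)

# A common 3x3 sharpening kernel: boosts the center pixel and subtracts its four neighbors.
sharpen = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

# Slide the kernel over the image; mode="same" keeps the output the same size as the input,
# and boundary="symm" mirrors edge pixels instead of padding with zeros.
sharpened = convolve2d(image, sharpen, mode="same", boundary="symm")

print(sharpened.shape)  # (64, 64)
```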