🦜 Bounded Rationality
@bjlkeng.github.io@rss-parrot.net
I'm an automated parrot! I relay a website's RSS feed to the Fediverse. Every time a new post appears in the feed, I toot about it. Follow me to get all new posts in your Mastodon timeline!
Brought to you by the RSS Parrot.
---
Understanding math, machine learning, and data to a satisfactory degree.
Is this your feed and you don't want it here? Just
e-mail the birb.
The Logic Behind the Maximum Entropy Principle
http://bjlkeng.github.io/posts/the-logic-behind-entropy/
Published: August 3, 2024 00:44
For a while now, I've really enjoyed diving deep to understand
probability and related fundamentals (see
here,
here, and
here).
Entropy is a topic that comes up all over the place from physics to information
theory, and of course, machine learning. I…
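(For reference, a minimal sketch of the entropy quantity the post centers on; Python is assumed here for illustration, and this snippet is not code from the post itself.)

import math

def shannon_entropy(probs):
    # H(p) = -sum_i p_i * log2(p_i), in bits, for a discrete distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries the maximum 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.47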
Iterative Summarization using LLMs
http://bjlkeng.github.io/posts/iterative-summarization-using-llms/
Published: June 4, 2024 00:21
After being busy for the first part of the year, I finally have a bit of time
to work on this blog. After a lot of thinking about how to best fit it into my
schedule, I've decided to attempt to write shorter posts. Although I do get
a lot of satisfaction…
A Look at The First Place Solution of a Dermatology Classification Kaggle Competition
http://bjlkeng.github.io/posts/a-look-at-the-first-place-solution-of-a-dermatology-classification-kaggle-competition/
Published: December 23, 2023 00:09
One interesting thing I often think about is the gap between academic and real-world
solutions. In general, academic solutions play in the realm of idealized problem
spaces, removing themselves from needing to care about the messiness of the real world.…
LLM Fun: Building a Q&A Bot of Myself
http://bjlkeng.github.io/posts/building-a-qa-bot-of-me-with-openai-and-cloudflare/
Published: September 25, 2023 00:56
Unless you've been living under a rock, you've probably heard of large language
models (LLMs) such as ChatGPT or Bard. I'm not one for riding a hype train, but
I do think LLMs are here to stay and are either going to have an impact as big
as mobile as an…
Bayesian Learning via Stochastic Gradient Langevin Dynamics and Bayes by Backprop
http://bjlkeng.github.io/posts/bayesian-learning-via-stochastic-gradient-langevin-dynamics-and-bayes-by-backprop/
Published: February 8, 2023 23:25
After a long digression, I'm finally back to one of the main lines of research
that I wanted to write about. The two main ideas in this post are not that
recent but have been quite impactful (one of the papers won a recent
ICML Test of Time Award). They…
An Introduction to Stochastic Calculus
http://bjlkeng.github.io/posts/an-introduction-to-stochastic-calculus/
Published: September 12, 2022 01:05
Through a couple of different avenues I wandered, yet again, down a rabbit hole
leading to the topic of this post. The first avenue was through my main focus
on a particular machine learning topic that utilized some concepts from
physics, which naturally…
Normalizing Flows with Real NVP
http://bjlkeng.github.io/posts/normalizing-flows-with-real-nvp/
Published: April 23, 2022 23:36
This post has been a long time coming. I originally started working on it several posts back but
hit a roadblock in the implementation and then got distracted with some other ideas, which took
me down various rabbit holes (here,
here, and
here).
It feels…
Hamiltonian Monte Carlo
http://bjlkeng.github.io/posts/hamiltonian-monte-carlo/
Published: December 24, 2021 00:07
Here's a topic I thought that I would never get around to learning because it was "too hard".
When I first started learning about Bayesian methods, I knew enough that I
should learn a thing or two about MCMC since that's the backbone
of most Bayesian…
Lossless Compression with Latent Variable Models using Bits-Back Coding
http://bjlkeng.github.io/posts/lossless-compression-with-latent-variable-models-using-bits-back-coding/
Published: July 6, 2021 16:00
A lot of modern machine learning is related to this idea of "compression", or,
to use a fancier term, "representations". Taking a high-dimensional space
(e.g. images of 256 x 256 x 3 pixels = 196608 dimensions) and somehow compressing it into
a 1000…
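(To make the dimension counting above concrete, a small sketch; Python/NumPy is assumed, and the 1000-dimensional latent code is only the illustrative figure from the excerpt, not the post's actual model.)

import numpy as np

# A 256 x 256 RGB image lives in a 256 * 256 * 3 = 196608 dimensional space.
image = np.zeros((256, 256, 3), dtype=np.float32)
flat = image.reshape(-1)
print(flat.shape[0])               # 196608

# A latent variable model's encoder (hypothetical here) would map this vector
# to a much smaller code, e.g. 1000 dimensions.
latent_dim = 1000
print(flat.shape[0] / latent_dim)  # roughly a 197x reduction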
Lossless Compression with Asymmetric Numeral Systems
http://bjlkeng.github.io/posts/lossless-compression-with-asymmetric-numeral-systems/
Published: September 26, 2020 14:37
During my undergraduate days, one of the most interesting courses I took was on
coding and compression. Here was a course that combined algorithms,
probability, and secret messages; what's not to like? I ended up not going
down that career path, at least…