Chess and Statistical Mechanics

To understand the workings of any complex system, you would first break it down to its fundamentals, understand those fundamentals clearly, and then use that understanding to make sense of the whole. This is the very basis of Physics, one of the most successful endeavours of humankind. With this approach of breaking systems down to fundamentals and building back up from that knowledge, we have arrived at a detailed and clear understanding of a great many things. ...

November 4, 2024 · Prakash

Sharapova

“She's exactly like Sharapova.” “Sharapova who?” “Maria Sharapova, of course.” “Who is Maria Sharapova?” “You know, the tennis player, the Russian.” “Oh, her?” “How is she anything like her? Height-wise she looks about half of Maria Sharapova.” “Who said anything about height? She looks just like her.” “How? Have you ever even met Sharapova?” “I don't need to have met her; the face is the same, the accent is the same, the manner of speaking is the same.” “The same manner of speaking? How, does she speak Russian too?” ...

July 23, 2024 · Prakash

MNIST Handwriting Recognition with Neural Network

In the previous post we discussed how to make a functional Neural Network (NN) with julia. In another post we discussed how to package our NN into an independent julia package. As described in that post, we can import the new package with using PNN. Here PNN is the NN package that we have written from scratch. One of the most common test cases for new Machine Learning algorithms is the publicly available MNIST human handwriting dataset. We can use that dataset to train and test our new Neural Network and verify that this package can indeed be used in a real-world application. It kind of blows my mind that such a simple manipulation of a few matrices can do a task as complex as handwriting recognition, but anyway here we are. ...
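As a rough sketch of how such a test could look, assuming the MLDatasets package as the MNIST source and hypothetical PNN functions init_network, train!, and predict standing in for whatever the package actually exports:

```julia
using MLDatasets   # assumed here for the MNIST data; not part of PNN
using PNN          # the from-scratch package from the earlier posts

# Load the training split: 28x28 grayscale images with integer labels 0-9.
train = MNIST(split=:train)
X = reshape(Float64.(train.features), 28 * 28, :)   # flatten each image to a 784-vector
Y = train.targets

# Hypothetical PNN calls: 784 inputs, one hidden layer of 30 nodes, 10 outputs.
net = init_network([784, 30, 10])
train!(net, X, Y; epochs = 10, lr = 0.1)

# Classify the first test image; subtracting 1 maps output index 1-10 to digit 0-9.
test = MNIST(split=:test)
x = reshape(Float64.(test.features[:, :, 1]), 28 * 28)
println(argmax(predict(net, x)) - 1)
```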

June 24, 2023 · Prakash

Neural Network julia Package

In a series of previous posts we described how to make a fully functional Neural Network in julia. Let's create a package so that we can import the neural network in julia as a regular package, available in every session. Let's call this new package PNN. As described in this post, create a directory PNN somewhere in a path:

    $ mkdir -p /some/path/PNN/src
    $ cd /some/path/PNN
    $ ls
    src

Inside the src directory create a file utils.jl with the following content ...
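For orientation, a minimal sketch of the entry-point module that would tie such a package together; the file path follows the layout above, and the exported names are hypothetical stand-ins for whatever utils.jl actually defines:

```julia
# /some/path/PNN/src/PNN.jl : the package's entry-point module (sketch)
module PNN

include("utils.jl")                     # pull in the helpers from src/utils.jl

export init_network, train!, predict    # hypothetical names, for illustration only

end # module
```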

June 8, 2023 · Prakash

Package in Julia

a brief walkthrough of making a package in julia
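For reference, Julia's built-in Pkg tooling can scaffold the same layout automatically; this uses only standard Pkg API and is shown as an alternative to creating the directories by hand:

```julia
using Pkg

Pkg.generate("PNN")          # creates PNN/Project.toml and PNN/src/PNN.jl
Pkg.develop(path = "PNN")    # track the local package in the active environment

using PNN                    # now importable like any regular package
```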

June 8, 2023 · Prakash

Neural Network with julia

After a series of previous posts we have learned the key steps of making a Machine Learning model: Gradient Descent, Back Propagation, Adding Multiple Nodes, and Changing the Activation Function. With all of this we are in a position to construct a fully functional neural network. We combine knowledge from all of these previous posts to construct a fully functional multi-layer, multi-node Neural Network. Setup For a fully functional network, similar to the last post, we can make multiple layers with multiple nodes each. ...
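A minimal sketch of the forward pass such a network performs; the struct and names are mine for illustration, not the post's exact code:

```julia
# A layer is a weight matrix plus a bias vector; a network is a list of layers.
struct Layer
    W::Matrix{Float64}
    b::Vector{Float64}
end

sigmoid(x) = 1 / (1 + exp(-x))

# Feed the input through every layer: affine map, then elementwise activation.
function forward(layers::Vector{Layer}, x::Vector{Float64})
    a = x
    for l in layers
        a = sigmoid.(l.W * a .+ l.b)
    end
    return a
end

# Example: 3 inputs, a hidden layer of 4 nodes, 2 outputs.
net = [Layer(randn(4, 3), zeros(4)), Layer(randn(2, 4), zeros(2))]
println(forward(net, rand(3)))
```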

June 14, 2022

Basic Neural Network with activation function

As we saw in the previous post, adding any number of layers and any number of nodes will only be able to model linear functions. If we want our network to also be able to learn non-linear functions, then we should introduce some non-linearity to the network. One way of adding non-linearity is to use a so-called activation function. The idea is that we pass the output of each layer through a (non-linear) function. Our examples so far can be considered to have been using a linear activation function. In particular, they all used the identity activation function ...
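For concreteness, a small sketch contrasting the identity activation that the earlier examples implicitly used with the sigmoid, one common non-linear choice:

```julia
identity_act(x) = x                # what the linear examples effectively used
sigmoid(x) = 1 / (1 + exp(-x))     # a standard non-linear activation

z = [-2.0, 0.0, 2.0]
println(identity_act.(z))          # [-2.0, 0.0, 2.0], passed through unchanged
println(sigmoid.(z))               # ≈ [0.119, 0.5, 0.881], squashed into (0, 1)
```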

June 10, 2022

Basic Neural Network with multiple nodes

In the previous two posts we have already seen how to construct a single-neuron neural network, with or without a hidden layer. If we want to feed multidimensional input into the network and get multidimensional output from it, then we have to add more nodes in each layer, so that each node can process an independent component of the input. For this purpose, in this post we will construct a network with a single output layer but with multiple nodes in the input layer. This is useful, for example, if we want to train our network to learn a scalar (one-number) function of an $N$-dimensional vector. ...
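A sketch of the shape being described: $N$ input nodes feeding a single output node, so the network computes a scalar function of an $N$-dimensional vector (illustrative values, not the post's code):

```julia
# N input nodes, one output node: y = w . x + b, a scalar function of an N-vector.
N = 4
w = randn(N)            # one weight per input node
b = randn()             # single bias on the output node
f(x) = sum(w .* x) + b

println(f(rand(N)))     # an N-dimensional vector in, one number out
```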

June 4, 2022

Backpropagation

Setup Similar to the last post, where we made a basic neural network with just one neuron and no hidden layer, we will now build one with two neurons and a single hidden layer.

(diagram: a_0 feeds b_1 through weight W_1, and b_1 feeds b_2 through weight W_2)

\begin{align*}
a^1 &= W^1 a^0 + b^1 \\
a^2 &= W^2 a^1 + b^2
\end{align*}

So the final output of the network as a function of the input $x$ becomes

\begin{align*}
y = W^2 \left( W^1 \cdot x + b^1 \right) + b^2
\end{align*}

This is linear in $x$: it expands to $y = (W^2 W^1)\, x + (W^2 b^1 + b^2)$, a single affine map. So we should be able to model any linear function with this network. Like before, let us try to learn a linear model through this primitive network. In the last post we trained the network to learn the linear model ...
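A compact sketch of one feedforward-and-backpropagation cycle for this two-layer network, with scalar weights standing in for the one-node layers; my illustration under a squared-error loss, not the post's code:

```julia
# Model: y = W2 * (W1 * x + b1) + b2, loss L = (y - t)^2 / 2.
function train(; lr = 0.01, steps = 5000)
    W1, b1, W2, b2 = randn(), randn(), randn(), randn()
    for _ in 1:steps
        x = rand()
        t = 3x + 1             # teach the network the linear model y = 3x + 1
        a1 = W1 * x + b1       # feedforward: hidden layer (linear activation)
        y  = W2 * a1 + b2      # feedforward: output layer
        d2 = y - t             # dL/dy at the output
        d1 = d2 * W2           # error backpropagated to the hidden layer
        W2 -= lr * d2 * a1     # gradient-descent updates
        b2 -= lr * d2
        W1 -= lr * d1 * x
        b1 -= lr * d1
    end
    return W2 * W1, W2 * b1 + b2   # effective slope and intercept
end

println(train())   # ≈ (3.0, 1.0)
```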

May 27, 2022

Basic neural network with julia

In this post I will try to describe building a basic regression model with gradient descent. This forms a very basic prototype of a neural network. It is by no means a full-fledged neural network, but it lays the fundamental foundation of one: feedforward to calculate the error, and back propagation to update the weights and biases from the error function. The code cells can be collected into a script and run as an individual julia script to reproduce everything described in this article. ...
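The core loop, sketched with assumed example values (a single weight and bias fitted to the line y = 2x + 0.5) rather than the article's exact script:

```julia
# Gradient descent for a one-neuron model y = w * x + b, squared-error loss.
function fit(xs, ts; lr = 0.1, epochs = 2000)
    w, b = 0.0, 0.0
    for _ in 1:epochs
        for (x, t) in zip(xs, ts)
            y = w * x + b      # feedforward
            d = y - t          # dL/dy for L = (y - t)^2 / 2
            w -= lr * d * x    # back propagate into the weight ...
            b -= lr * d        # ... and into the bias
        end
    end
    return w, b
end

xs = collect(0.0:0.1:1.0)
println(fit(xs, 2.0 .* xs .+ 0.5))   # ≈ (2.0, 0.5)
```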

May 16, 2022