
Markov chain in JavaScript

I made a small Markov chain joke generator during my coffee break sometime last week. This is a continuation of the last post, where we did a similar thing. I did this specifically to see how well the idea carries over to a language I have not typically used for ML/NLP before.


Let me run you guys through it.

First of all, the Markov chain needs a bunch of data to tell it how exactly you want your sentences constructed.


str_arr=[sentence1, sentence2,...]
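
For instance, a tiny corpus could look like the following (the sentences here are made up for illustration; any set of strings works):

const str_arr = [
  "The man had a dog.",
  "The dog chased the cat.",
  "A man walked his dog in the park."
];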


Next, we create a dictionary of all trigrams present across the sentences. To do this, we use each bigram as a key, and the succeeding word as the corresponding value. Each key-value pair thus forms a trigram.
As an example, consider the sentence: “The man had a dog.”
The dictionary for this sentence will have:
[ {[The, man] : [had]}, {[man, had] : [a]}, {[had, a] : [dog]} ]
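
A minimal sketch of how this dictionary could be built, assuming sentences are split on whitespace (the function and variable names here are my own, not from the original code):

// Map every bigram to the list of words seen following it.
// Keys are the two words joined with a space, e.g. "The man".
function buildDict(str_arr) {
  const dict = {};
  for (const sentence of str_arr) {
    const words = sentence.split(/\s+/).filter(Boolean);
    for (let i = 0; i < words.length - 2; i++) {
      const key = words[i] + " " + words[i + 1];
      if (!dict[key]) dict[key] = [];
      dict[key].push(words[i + 2]);
    }
  }
  return dict;
}

Note that a bigram seen many times simply collects all of its successors, repeats included, so the list doubles as a rough frequency count.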

Next up: using the dictionary we just made to create sentences. Here we provide the first two words, and let the function work its magic to complete the sentence. The first two words are used as the key to search the dictionary for a candidate third word, which is appended to the first two. Then the second and third words are taken as the key, and so on. If there are multiple succession candidates for a particular pair, any one of them becomes the Chosen One at random. The process continues until no succeeding word is found, and the words collected so far form our new sentence.
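
Here is a rough sketch of that generation step under the same assumptions (again, generate is my own name for it, and the 50-word cap is just a guard I added so the chain cannot loop forever):

// Grow a sentence from two seed words using the bigram -> successors map.
function generate(dict, word1, word2) {
  const result = [word1, word2];
  let key = word1 + " " + word2;
  while (dict[key] && result.length < 50) {
    const candidates = dict[key];
    const next = candidates[Math.floor(Math.random() * candidates.length)];
    result.push(next);
    key = result[result.length - 2] + " " + next;
  }
  return result.join(" ");
}

// Usage, e.g.: generate(buildDict(str_arr), "The", "man");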

That’s it! Some observations I would like to make here: one could try to extend the trigrams to n-grams, but the complexity goes up with n. Instead of selecting from the candidate words uniformly at random, one can have a probability-based selection as well. And instead of just sentences as input (and output), one can have paragraphs and even (if we dare dream so high) whole essays.
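
As a rough illustration of the probability-based selection idea (my own sketch, not part of the original generator): since the successor lists keep duplicates, an explicit frequency-weighted pick could look like this.

// Pick a successor with probability proportional to how often it was seen.
function weightedPick(candidates) {
  const counts = {};
  for (const w of candidates) counts[w] = (counts[w] || 0) + 1;
  let r = Math.random() * candidates.length;
  for (const w of Object.keys(counts)) {
    r -= counts[w];
    if (r <= 0) return w;
  }
  return candidates[candidates.length - 1];
}

Of course, picking uniformly from a list that keeps repeats already weights by frequency; this version just makes the weighting explicit and easy to tweak.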
