Dynamic programming 0

What is Dynamic Programming?

Dynamic programming is a design paradigm that helps eliminate redundant computation. In essence, the problem at hand has to have nice substructures, so that solutions to subproblems can be combined into a solution of the whole problem.

A great example is the Fibonacci sequence - 1, 1, 2, 3, 5, 8, 13 ... - where Fib(N) = Fib(N-1) + Fib(N-2) for N > 2, with Fib(1) = Fib(2) = 1 and Fib(N) meaning the N-th Fibonacci number.
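
Here is a minimal C++ sketch (not from the original post) of how memoization removes the redundant computation from the naive Fibonacci recursion:

#include <cstdint>
#include <iostream>
#include <unordered_map>

// Naive recursion recomputes Fib(k) many times; memoization stores each result
// the first time it is computed, so every Fib(k) is computed only once.
std::uint64_t fib(int n, std::unordered_map<int, std::uint64_t>& memo)
{
    if (n <= 2) return 1;                       // Fib(1) = Fib(2) = 1
    auto it = memo.find(n);
    if (it != memo.end()) return it->second;    // reuse the subproblem solution
    std::uint64_t result = fib(n - 1, memo) + fib(n - 2, memo);
    memo[n] = result;
    return result;
}

int main()
{
    std::unordered_map<int, std::uint64_t> memo;
    std::cout << fib(50, memo) << '\n';         // prints 12586269025
}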

Where did it come from?

In the early 50s and 60s of the last century, optimization problems related to industrialization brought new insights into understanding the substructures of problems and using them. This soon became a new field of mathematics called Operations Research. Bellman is one of the founders of this branch of mathematical analysis applied to optimization problems.

What is an optimization problem?

When we have a finite resource, how to use it in the most efficient way is, in essence, an optimization problem. For example: if you have 100 bucks to spend for a day, and you are given a set of alternatives along with their costs, how would you optimize the use of those 100 bucks?

So there is an objective function (the optimization function) that we want to maximize or minimize, and there are constraints or bounds that limit the scope of the optimization. So it would be fair for extremely rich persons (fill in the names here) not to bother tackling these problems to optimize a month's grocery budget, for example.
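
To make this concrete, here is a small hypothetical sketch in C++ of the 100-bucks problem treated as a 0-1 knapsack; the alternatives, their costs, and their values are made up purely for illustration. The total value is the objective function, and the budget is the constraint:

#include <algorithm>
#include <iostream>
#include <vector>

int main()
{
    const int budget = 100;                       // the constraint (bound)
    // Hypothetical alternatives: cost of each, and how much value we get from it.
    std::vector<int> cost  = {25, 60, 40, 15, 70};
    std::vector<int> value = {30, 80, 45, 20, 85};

    // best[b] = maximum total value achievable with budget b.
    std::vector<int> best(budget + 1, 0);
    for (std::size_t i = 0; i < cost.size(); ++i)
        for (int b = budget; b >= cost[i]; --b)   // go downward so each item is used at most once
            best[b] = std::max(best[b], best[b - cost[i]] + value[i]);

    std::cout << "Best value for " << budget << " bucks: " << best[budget] << '\n';
}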

Why is it important for programmers?

A lot of problems are being solved using computers, and many of them require a lot of computing resources when solved programmatically. Some large problems are impossible to solve by any other means.

Where can one get an understanding of Dynamic Programming?

You can find lots of algorithm books, online documents, etc. But the groundwork has to come from Bellman et al.'s books. The reason is that identifying substructures is the key.

Common problems that are used for interviews are (a sketch of the first one, Longest Common Subsequence, follows the list) ---

Longest Common Subsequence

Longest Increasing Subsequence

Edit Distance

Minimum Partition

Ways to Cover a Distance

Longest Path In Matrix

Subset Sum Problem

Optimal Strategy for a Game

0-1 Knapsack Problem

Boolean Parenthesization Problem

Shortest Common Supersequence

Matrix Chain Multiplication

Partition problem

Rod Cutting

Coin change problem

Word Break Problem

Maximal Product when Cutting Rope

Dice Throw Problem

Box Stacking

Egg Dropping Puzzle

 ...
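
As a hedged sketch of just the first problem in the list above, here is the standard bottom-up table for Longest Common Subsequence in C++; dp[i][j] holds the LCS length of the first i characters of a and the first j characters of b:

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Classic DP: dp[i][j] = length of the LCS of a[0..i) and b[0..j).
int lcsLength(const std::string& a, const std::string& b)
{
    std::vector<std::vector<int>> dp(a.size() + 1, std::vector<int>(b.size() + 1, 0));
    for (std::size_t i = 1; i <= a.size(); ++i) {
        for (std::size_t j = 1; j <= b.size(); ++j) {
            if (a[i - 1] == b[j - 1])
                dp[i][j] = dp[i - 1][j - 1] + 1;                    // extend a common character
            else
                dp[i][j] = std::max(dp[i - 1][j], dp[i][j - 1]);    // drop one character
        }
    }
    return dp[a.size()][b.size()];
}

int main()
{
    std::cout << lcsLength("AGGTAB", "GXTXAYB") << '\n';            // prints 4 ("GTAB")
}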

Posted on Sunday, April 8, 2018 at 02:42PM by Prokash Sinha

When AI > NI

These days, Artificial Intelligence > Natural Intelligence. The days of Frankenstein have finally arrived. Every high-tech company is investing in AI, or at least broadcasting that they are using it or very soon will be. Sooooo, I started looking back at what we were doing 30 years back in graduate school! Back then AI was mostly natural-language-based expert systems or some heuristics-based expert systems - for example, computer-based chess players. Robotics, neural networks, etc., were in their dawning days. We did not have many branches like ML, supervised learning, deep learning, etc. I took a class in AI, then dropped out mid-semester; it did not make much sense.

Then came the statistical / Bayesian decision processes, and now AI is as pervasive as the cell phone. I did a sampling of some of the books in AI, and got stuck on the book Deep Learning! Looking at the book gave me immense joy, and disappointment too. The book is wonderfully written, from my first glance at a few chapters. The joy was that most of the topics were covered back then, in a different form and for a different purpose - like statistical methods, probability and measure toward discrete martingales, operations research, stochastic processes, etc. The disappointment is that there really is deep learning ahead for me again, if I need to look at those topics again...

The big question is -- how do we make a massively parallel machine learn? How could we make such a machine available to almost all areas of scientific research?

 

There are many areas that need to be covered before I could even start saying something ...

One such area is Dynamic Programming, which is essential for any optimization problem that has overlapping substructures. Finding such structures and using them requires key insight into the problem at hand.

So I will start with some dynamic programming next.

Posted on Sunday, April 1, 2018 at 05:47PM by Prokash Sinha

Design, Resign ...

Do I really like to do the same thing time and again? That's the question I ask myself. The answer is different for different situations. Some tasks I can't leave to others - like taking a shower - and that is repetition I cannot avoid :-). Others I really don't want to repeat.

One such example is tracing your program through debug messages, so you know when was the last time the program behaved as expected, or at least as hoped for.

So how could I write a debugging module that would take care of most of my tracing work when I need to trace? Yeah, you will say - as if none of us has seen this before. And you are right. We have seen a lot of them. Sometimes they are old cruft, and the code is de facto standard copy/paste from even older code. No one really likes to trace/debug, or at least not much of it. Hence the result.

Recently, with so many things adding to software complexity, people are trying out different kinds of debug tracing, and even systems themselves support logging and tracing. Some of these are system supplied, so you cannot change their behaviour, and they are sometimes very hard to share with experts on the other side of the globe or galaxy.

Questions abound -

1) How can I use a design that does not restrict me in most ways, such as the use of common languages like Java, C++, Objective-C, Swift, etc.?

2) How can I use it in a location-independent way?

3) How can I turn tracing on/off at will and control the different types of tracing messages I'm interested in?

4) How can I make it fairly secure and private?

5) How can I make it thread safe, so that it does not get in the way of the debugging / tracing itself?

etc., etc...

All of these boil down to one thing, in some sense: decouple every little feature, since we almost never have complete requirements a priori. Almost always, we will see that something is either not present or was never thought about at the design table.

All of this leads to an open-ended design, so we can plug and play as we go along without compromising structural integrity by a large magnitude. And that translates to Design Patterns. Mapping requirements to existing design patterns (or even coming up with new ones) is the process I'm talking about, using an example implementation of a software component.

In summary, it will have the formatting metadata to know where a log entry is coming from, it will handle privacy/encryption, etc., it will allow dynamic control of what we trace and where the traces go, and it should allow use from multiple related programming languages...

The basic design pattern I chose for this is called the Observer pattern. It is part of a class of patterns named behavioural patterns. The choice of implementation language is C++, since a C interface is common to the languages I mentioned above, along with many more.

Note that not all patterns are simple, and some of them are fairly complex to get right. Here we choose Observer to decouple the producer of a trace from its consumers, so that we can dynamically control both the trace message types and the destinations of the messages. A message can land on another remote machine, as well as go to quick console debugging and system file logging. Messages can also be for different reasons (event types).
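
A minimal C++ sketch of what such an Observer-based trace module might look like; the class and function names here are hypothetical illustrations, not the actual component:

#include <fstream>
#include <iostream>
#include <memory>
#include <string>
#include <utility>
#include <vector>

enum class TraceLevel { Info, Warning, Error };   // event types

// Observer: anything that wants to consume trace messages.
class TraceSink {
public:
    virtual ~TraceSink() = default;
    virtual void onTrace(TraceLevel level, const std::string& msg) = 0;
};

class ConsoleSink : public TraceSink {
public:
    void onTrace(TraceLevel, const std::string& msg) override {
        std::cout << "[console] " << msg << '\n';
    }
};

class FileSink : public TraceSink {
public:
    explicit FileSink(const std::string& path) : out_(path, std::ios::app) {}
    void onTrace(TraceLevel, const std::string& msg) override {
        out_ << msg << '\n';
    }
private:
    std::ofstream out_;
};

// Subject: the producer side; it does not know or care who consumes the traces.
class Tracer {
public:
    void attach(std::shared_ptr<TraceSink> sink) { sinks_.push_back(std::move(sink)); }
    void setMinimumLevel(TraceLevel level) { minLevel_ = level; }   // dynamic control
    void trace(TraceLevel level, const std::string& msg) {
        if (level < minLevel_) return;            // filter message types at run time
        for (auto& sink : sinks_) sink->onTrace(level, msg);
    }
private:
    std::vector<std::shared_ptr<TraceSink>> sinks_;
    TraceLevel minLevel_ = TraceLevel::Info;
};

int main()
{
    Tracer tracer;
    tracer.attach(std::make_shared<ConsoleSink>());
    tracer.attach(std::make_shared<FileSink>("trace.log"));
    tracer.trace(TraceLevel::Info, "module initialized");
    tracer.trace(TraceLevel::Error, "something went wrong");
}

A real version would also guard the sink list with a mutex to address the thread-safety question above, and could add a network sink so messages can land on a remote machine.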

The Adapter pattern is used to capture multi-platform support.

Posted on Saturday, March 10, 2018 at 12:57PM by Prokash Sinha

Roller Coaster

It's been almost 30 years since I first heard "The only thing constant in High Tech is Change".

I can't think of anything better than that to reflect our experience, hence this note about the state of affairs.

So what are the changes that we should be worried about, interested in, or otherwise curious about? It's just two things: first Deep Learning, next Quantum Computing. In another 5 years, they will be the mainstream knowledge one would be required to have. For me, the only thing that is constant is Change.

I like those two areas, just because I have some bias toward Mathematics and Statistics. I think I was once quite well trained in those areas, but it was not easy to find a job where I could use them for fun and profit!

Quantum Computation and Quantum Information and Deep Learning are two books I'm about to embark on. Well, I had to start sometime, so I started now.

Why are they important? Well, they are already being exploited, and they are going to be the next stage of High Tech. Most people using a wide variety of devices already get their benefits, directly or indirectly; they just don't know that behind the scenes these are cropping up to play major roles. In one word - AI.

Recently, I got engaged in corporate research for new and engaging customer-facing products. Yeah, the only thing constant is ... Also, I really did not want to retire being called Senior Engineer. I've had that title for quite some years, and I know very well that when someone is retired, they get the title Senior and wait for Tuesday of every week to get a discount on almost everything, at least where I live...

 

Posted on Wednesday, January 10, 2018 at 06:48PM by Prokash Sinha

Tidbits of stack

What is a stack? And what is it used for?

The stack is really a way to handle the modularity of software. In the long past the stack was designed and crafted by hand, when most software was built using assembly language. The idea is first to break large code down into manageable functions, which are in turn called by yet other functions. So there had to be a way to pass arguments at call time and, among other things, to revert back to the previous state. The result is the stack machine, in the sense that the underlying architecture has enough support to generate standard template code for the stack paradigm.

After that the stack became a software pattern that is one of the abstract data types. Queue, list, and double-ended queue are some of the others.
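
As a hedged, minimal C++ sketch, here is the stack as an abstract data type (independent of the hardware call stack discussed below):

#include <iostream>
#include <stdexcept>
#include <vector>

// A minimal stack ADT: last in, first out.
template <typename T>
class Stack {
public:
    void push(const T& value) { items_.push_back(value); }
    void pop() {
        if (items_.empty()) throw std::underflow_error("pop on empty stack");
        items_.pop_back();
    }
    const T& top() const {
        if (items_.empty()) throw std::underflow_error("top on empty stack");
        return items_.back();
    }
    bool empty() const { return items_.empty(); }
private:
    std::vector<T> items_;
};

int main()
{
    Stack<int> s;
    s.push(1);
    s.push(2);
    s.push(3);
    while (!s.empty()) {          // prints 3 2 1 - LIFO order
        std::cout << s.top() << ' ';
        s.pop();
    }
    std::cout << '\n';
}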

Nowadays, whenever one function calls another function, there is stack management work going on.

At the start of a function (on most systems) the following is a standard prologue -

push %rbp

mov %rsp, %rbp

%rbp is the frame pointer, and %rsp is the stack pointer. So the callee pushes the caller's frame pointer onto the stack and copies the stack pointer into %rbp. Now the stack pointer is free to be modified by the callee, and at the end it can be restored from %rbp (the frame pointer).

How??? When we define local variables to be used by a function, the system reserves space to hold those local variables' values. The following code does just that ---

sub $0x48, %rsp

You can see that %rsp has now changed and is pointing to lower-addressed memory. The stack grows downward.

In this case the local variables take 0x48 bytes to hold their current values.

Now, after the computation within this function, it has to put back the frame pointer, as well as clean up the stack frame it built, using the following ---

mov %rbp, %rsp    # %rbp was untouched in the body (never used as a destination), so copying it back restores the stack pointer.

pop %rbp

%rbp, being the frame pointer, is never supposed to be used as the target of any operation except the prologue and the epilogue (the previous two lines are called the epilogue). Wherever a local variable is used in the computation, %rbp is used as a reference point. Like

mov -0x8(%rbp), %rbx

add $0x1, %rbx
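
To tie the pieces together, here is a small C++ function of the kind that, compiled without optimization, typically produces the prologue, a local-variable reservation, %rbp-relative body instructions like the two above, and the epilogue. The exact instructions and offsets vary by compiler and optimization level, so treat the mapping as illustrative:

// Compiled without optimization (e.g. g++ -O0), a function like this typically
// gets the prologue (push %rbp; mov %rsp, %rbp), a "sub $..., %rsp" to reserve
// space for the locals, %rbp-relative loads/stores such as mov -0x8(%rbp), %rbx
// for the body, and the epilogue (mov %rbp, %rsp; pop %rbp; ret).
long addOne(long x)
{
    long local = x;   // stored at some negative offset from %rbp, e.g. -0x8(%rbp)
    local += 1;       // corresponds to an add of $0x1 on the loaded register
    return local;
}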

More ...

Posted on Monday, May 22, 2017 at 09:32PM by Prokash Sinha