How can Markov decision processes provide a powerful tool for optimizing the performance of stochastic processes that can be modeled as discrete-time systems?
Let’s start with the concept of stochastic models. Every company aims to satisfy its customers and meet their demand, so in this model we look at how to maintain inventory so that customer demand is met. Broadly, there are two types of inventory models: deterministic and stochastic. In a stochastic model, demand is random rather than known in advance, so the inventory is planned and monitored continuously, and regular reviews reveal any gaps between the plan and actual demand. Decisions are made one period at a time, which supports inventory-management decisions; this model is mostly used for perishable products or goods. It also has low setup costs and is simple to follow. For example, this method can tell us how many buns or cakes should be made in a day. A Markov decision process (MDP) within a stochastic model helps pick the best policy for decision making when outcomes are not fixed but random in nature.
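To make the bakery example concrete, here is a minimal sketch of an MDP solved by value iteration. All numbers (shelf capacity, prices, demand probabilities, discount factor) are assumed for illustration only; the state is yesterday's leftover stock and the action is how many buns to bake today.

```python
MAX_INV = 3                         # shelf capacity (assumed)
PRICE, COST, HOLD = 5, 2, 1         # sale price, unit cost, holding cost (assumed)
GAMMA = 0.9                         # discount factor (assumed)
DEMAND = {0: 0.3, 1: 0.5, 2: 0.2}   # assumed daily demand distribution

STATES = range(MAX_INV + 1)         # state = buns left over from yesterday

def outcomes(s, a):
    """Yield (probability, reward, next_state) for leftover s and bake quantity a."""
    for d, p in DEMAND.items():
        stock = s + a
        sold = min(stock, d)
        left = stock - sold
        reward = PRICE * sold - COST * a - HOLD * left
        yield p, reward, left

def q_value(V, s, a):
    """Expected one-step reward plus discounted value of the next state."""
    return sum(p * (r + GAMMA * V[nxt]) for p, r, nxt in outcomes(s, a))

# Value iteration: repeatedly apply the Bellman optimality update.
V = {s: 0.0 for s in STATES}
for _ in range(200):
    V = {s: max(q_value(V, s, a) for a in range(MAX_INV - s + 1)) for s in STATES}

# Greedy policy: for each leftover level, how many buns to bake today.
policy = {s: max(range(MAX_INV - s + 1), key=lambda a: q_value(V, s, a))
          for s in STATES}
print(policy)
```

Because the demand outcomes are random, the policy balances the margin on each sale against the cost of baking buns that may go unsold, which is exactly the trade-off an MDP optimizes.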