64 pages, Grade: 9.31
LIST OF FIGURES
LIST OF TABLES
CHAPTER 1. INTRODUCTION
1.1. Bullwhip Effect
1.2. Complex Systems
1.3. Hurst Exponent
1.4. Importance of The Topic
CHAPTER 2. LITERATURE REVIEW
CHAPTER 3. METHODOLOGY
3.1 Data collection
3.1.1 Computer Consumable
3.1.2 Automotive Components
3.1.3 Washing Powder
3.1.4 Fresh Juice
3.2 The Present Method
3.3 The Proposed Method
3.4 Hurst Exponent Calculation
3.4.1 Rescaled range analysis
3.4.2 Detrended fluctuation analysis
3.5 Lyapunov Exponent
3.6 Matlab Software
CHAPTER 4. COMPUTATIONAL RESULTS AND ANALYSIS
4.1 Variance Change as a Function of H
4.2 Variance Change as a Function of Number of Data
4.3. Hurst Exponent Calculation from Data
4.3.1 Computer consumable-production
4.3.2 Computer consumable-demand
4.3.3 Automotive component-production
4.3.4 Automotive component-demand
4.3.5 Washing Powder-Production
4.3.6 Washing Powder-Demand
4.3.7 Fresh Juice-Production
4.3.8 Fresh Juice-Demand
4.4 Lyapunov Exponent Calculation from Data
CHAPTER 5. CONCLUSION AND FUTURE WORK
Behind every successful endeavour there is the involvement of many inspiring people. It is with a deep sense of gratitude that I acknowledge all the helping hands that have contributed to the successful completion of this thesis.
Foremost in the list is Dr. K. Krishnakumar, Head of the Department of Mechanical Engineering, for his support and encouragement. My deep thanks to Prof. Mahesh S., Assistant Professor, Mechanical Engineering, for guiding and correcting various documents with attention and care. He took pains to go through the thesis and make the necessary corrections as and when needed.
Above all, I thank God for the immense grace and blessings at all stages of the thesis.
Increasing customer demand, intense market competition, fluctuating demand, the need for faster response, and advances in technology have turned the market into an unstable environment. Together, these conditions create greater uncertainty in supply chains and lead to the bullwhip effect: orders to the supplier tend to show larger variations than demand from the buyer, and the distortion propagates upstream in amplified form. The major problems associated with the bullwhip effect are increased safety stock and carrying cost at each supply chain echelon, decreased customer satisfaction, and inefficient production processes at each echelon. It is therefore important to identify the bullwhip effect in a supply chain, and one important step in analyzing it is quantifying it accurately. Usually the bullwhip effect is calculated as the ratio of order variance to demand variance. This method assumes that the variables in the data sets are independent and identically distributed. In actual practice, however, order and sales data show some kind of dependency within the data set. This dependency is represented by a parameter known as the Hurst exponent, which can be estimated by methods such as rescaled range analysis and detrended fluctuation analysis (DFA). This thesis introduces a better method for the quantification and analysis of the bullwhip effect by taking the dependency of the data set into account.
Keywords: Bullwhip Effect, Detrended Fluctuation Analysis (DFA), Hurst Exponent.
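The classical quantification mentioned in the abstract, the ratio of order variance to demand variance, can be sketched as follows. This is a minimal illustration with made-up numbers, not the thesis's own data or code:

```python
# Classical bullwhip-effect measure: variance of orders placed upstream
# divided by variance of demand observed downstream. A ratio greater
# than 1 signals demand amplification. Example series are illustrative only.

def bullwhip_ratio(orders, demand):
    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)
    return variance(orders) / variance(demand)

demand = [100, 102, 98, 101, 99, 103, 97, 100]   # fairly steady retail demand
orders = [100, 110, 85, 108, 92, 112, 80, 105]   # amplified upstream orders

ratio = bullwhip_ratio(orders, demand)
print(round(ratio, 2))   # well above 1: orders vary far more than demand
```

Note that this measure implicitly treats the observations as independent and identically distributed, which is exactly the assumption the thesis challenges.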
Increasing customer demand, intense market competition, fluctuating demand, the need for faster response, and advances in technology have turned the market into an unstable environment. Together, these conditions create greater uncertainty in supply chains and lead to the bullwhip effect. An integrated supply chain includes the purchasing of raw materials; manufacturing, with assembly and sometimes also disassembly; and the distribution and repackaging of produced goods sent to the end customers. The various operating stages in the logistics chain (the nodes of the chain) can be represented by a simple model of processing cells, which perform material transformations or location changes, connected by arcs. In each processing cell, value is added and some costs are incurred. At each processing cell there is a supply and a demand, and often both are stochastic in nature. Inventories are insurance against the risk of a shortage of goods in each cell of the logistics chain. They are limited by the given capacity of each processing node and also by the transportation capability of the input and output flows.
Fig. 1.1 Supply chain [6] (figure not included in this excerpt)
The bullwhip effect is one of the most popular concepts in the operations management/research field. The term 'bullwhip' was coined to describe the effect by which slowly moving consumer demand creates large swings in production for suppliers at the other end of the supply chain, analogous to the handle of a bullwhip causing a loud crack at the popper. The bullwhip effect is also referred to as 'demand amplification', 'variance amplification' or the 'Forrester effect'. The effect becomes significant when the cost of fluctuations in production outweighs the cost of holding inventory. Over the years, evidence has suggested that bullwhip costs play a pivotal role in some businesses. Bullwhip costs can be associated with setting up and shutting down machines, idling and overtime in the workload, hiring and firing of the workforce, excessive upstream inventory, difficulty in forecasting and scheduling, system nervousness, and poor supplier/customer relationships, among other consequences.
Fig. 1.2 Bullwhip effect [6] (figure not included in this excerpt)
The bullwhip effect first appeared in the literature as a subject of study in 1961 [9]. The author noticed the effect in the course of simulation analyses and initially described the problem as demand amplification. The bullwhip effect results from the system itself, that is, from its policies, its organizational structure, and the delays in its material and information flows, rather than from external sources. It can be defined as the effect of a lack of information exchange between the components of the supply chain and of non-linear interactions between them, both of which make the chain difficult to manage. Another author, in 1989 [2], argued that the bullwhip effect arises from a lack of understanding of the ordering behavior of other participants in the supply chain and from the irrational reactions of decision makers. Since people have difficulty analyzing the impact of ordering decisions on a complex system with time delays between ordering and receiving, training on the bullwhip effect would be necessary for managers. On the other hand, the bullwhip effect occurs in one way or another even if all participants in the supply chain behave optimally; it can be avoided only if the supply chain is rebuilt, along with the various strategies and mutual relations within it [8]. Studies indicate that the bullwhip effect results from four factors [5, 16].
Demand forecasting by every participant in the supply chain is an important factor. Demand forecasts are elaborated in every link of the supply chain on the basis of historical data and of information about planned publicity drives or other actions shaping final consumer demand. These forecasts are modified after orders are received from clients. In reality, every company handles a different output: it uses distorted information about market demand and takes supply decisions on that basis. A long supply chain with many intermediate links amplifies the bullwhip effect, because demand becomes more unstable at every level; moreover, the information transmission time and the material flow time are prolonged, which means a longer response time to changes in demand on the retail market. A decision to eliminate some of the intermediate links must be preceded by a detailed analysis comparing the value added by each agent against the cost of its functioning; as a result of the reduction of unprofitable links, the performance of the new supply chain can rise.
Order batching: orders are assembled and inventories maintained according to various principles. Ordering policy depends on companies' internal procedures and often relies on grouping orders and periodically ordering large batches of goods. The reasons for such a procedure can be the high cost of preparing and placing an order, transport savings from full truckloads, discounts and rebates given by suppliers for the purchase of large batches, minimum order sizes dictated by suppliers, and also the supplier's credit policy (e.g., when payment for goods bought in a given month is due 30 days after the end of that month, clients wait until the beginning of the next month to place their orders). The accounting period has a similar effect: sales representatives push orders out towards the end of the period in order to meet their sales plans. Such activation of sales produces an abrupt influx of orders at one time, while demand is minimal through the rest of the period. This means heavy stocks for most of the period and difficulties in filling orders at the moment of peak demand. When orders are assembled according to such principles, an order does not carry information about current demand but about demand from a few days or weeks earlier, corrected for the quantities needed to replenish stocks.
Price fluctuations: suppliers periodically offer customers various promotions in the form of price or quantity discounts, rebates, coupons, and favorable payment terms. As a rule, companies react to these offers by ordering large quantities of products during the promotion, regardless of demand. They place the next order when stocks are exhausted or during the next promotion. This leads to large swings in purchases that do not reflect the actual demand reported by the lower levels of the supply chain. It is estimated that 75% of the purchase transactions between producers and distributors in the food branch are forward buys motivated by a favorable price offer.
Understanding the causes of the bullwhip effect can help managers find strategies to mitigate it. Indeed, many companies have begun to implement innovative programs that partially address this effect. Next we examine how companies tackle each of the four causes. We categorize the various initiatives and other possible remedies based on the underlying coordination mechanism, namely information sharing, channel alignment, and operational efficiency. With information sharing, demand information at a downstream site is transmitted upstream in a timely fashion. Channel alignment is the coordination of pricing, transportation, inventory planning, and ownership between the upstream and downstream sites in a supply chain. Operational efficiency refers to activities that improve performance, such as reduced costs and lead time [13]. The following steps help to mitigate the bullwhip effect.
Ordinarily, every member of a supply chain conducts some sort of forecasting in connection with its planning (e.g., the manufacturer does the production planning, the wholesaler the logistics planning, and so on). Bullwhip effects are created when supply chain members process the demand input from their immediate downstream member in producing their own forecasts. The demand input from the immediate downstream member, of course, results from that member's forecasting, with input from its own downstream member. One remedy to this repetitive processing of consumption data in a supply chain is to make the demand data at a downstream site available to the upstream site, so that both sites can update their forecasts with the same raw data. In the computer industry, manufacturers request sell-through data on withdrawn stocks from their resellers' central warehouses. Although the data are not as complete as point-of-sale (POS) data from the resellers' stores, they offer significantly more information than was available when manufacturers did not know what happened after they shipped their products. IBM, HP, and Apple all require sell-through data as part of their contracts with resellers. Supply chain partners can use electronic data interchange (EDI) to share their data. In the consumer products industry, about 20 percent of orders by retailers of consumer products were transmitted via EDI in 1990. The increasing use of EDI will undoubtedly facilitate information transmission and sharing among chain members. Even if the multiple organizations in a supply chain use the same source demand data to perform forecast updates, differences in forecasting methods and buying practices can still lead to unnecessary fluctuations in the order data placed with the upstream site. In a more radical approach, the upstream site could control the resupply from the upstream to the downstream site.
The upstream site would have access to the demand and inventory information at the downstream site and would update the necessary forecasts and resupply for the downstream site; the downstream site, in turn, would become a passive partner in the supply chain. In the consumer products industry, this practice is known as vendor-managed inventory (VMI) or a continuous replenishment program (CRP). Many companies, such as Campbell Soup, M&M/Mars, Nestle, Quaker Oats, P&G, and Scott Paper, use CRP with some or most of their customers, and inventory reductions of up to 25 percent are common in these alliances. P&G uses VMI in its diaper supply chain, starting with its supplier, 3M, and its customer, Wal-Mart. Even in the high-technology sector, companies such as Texas Instruments, HP, Motorola, and Apple use VMI with some of their suppliers and, in some cases, with their customers. Inventory researchers have long recognized that multi-echelon inventory systems operate better when inventory and demand information from the downstream sites is available upstream: echelon inventory (the total inventory at the upstream and downstream sites) is key to optimal inventory control. Another approach is to obtain demand information about the downstream site by bypassing it. Apple Computer has a "consumer direct" program, i.e., it sells directly to its consumers without going through the reseller and distribution channel; a benefit of the program is that it allows Apple to see the demand patterns for its products. Dell Computers also sells its products directly to consumers without going through the distribution channel. Finally, as noted before, long resupply lead times can aggravate the bullwhip effect. Improvements in operational efficiency can help reduce the highly variable demand due to multiple forecast updates; hence, just-in-time replenishment is an effective way to mitigate this effect [13].
Since order batching contributes to the bullwhip effect, companies need to devise strategies that lead to smaller batches or more frequent resupply. In addition, the counter-strategies described earlier are useful: when an upstream company receives consumption data on a fixed, periodic schedule from its downstream customers, it will not be surprised by an unusually large batched order when there is a demand surge. One reason that order batches are large or order frequencies low is the relatively high cost of placing and replenishing an order. EDI can reduce the cost of the paperwork in generating an order. Using EDI, companies such as Nabisco perform paperless, computer-assisted ordering (CAO), and, consequently, customers order more often and more frequently. McKesson's Economost ordering system uses EDI to lower the transaction costs of orders from drugstores and other retailers. P&G has introduced standardized ordering terms across all business units to simplify the process and dramatically cut the number of invoices; it expects to purchase at least $2 billion in materials through its internally developed Trading Process Network, and a paper purchase order that typically cost $50 to process now costs $5. Another reason for large order batches is the cost of transportation [13].
The differences between the costs of full truckloads and less-than-truckload shipments are so great that companies find it economical to order full truckloads, even though this leads to infrequent replenishment from the supplier. In fact, even if orders can be placed with very little effort and at low cost through EDI, the improvements in ordering efficiency are wasted because of the full-truckload constraint. Some manufacturers now induce their distributors to order assortments of different products, so that a truckload may contain different products from the same manufacturer (either a plant warehouse site or a manufacturer's market warehouse) instead of a full load of the same product.
The effect is that, for each product, the order frequency is much higher, the frequency of deliveries to the distributors remains unchanged, and the transportation efficiency of the firm is preserved. P&G has given discounts to distributors that are willing to order mixed-SKU (stock-keeping unit) loads of any of its products. Manufacturers could also prepare and ship mixed SKUs to the distributors' warehouses, ready to deliver to stores. "Composite distribution" for fresh produce and chilled products uses the same mixed-SKU concept to make resupply more frequent: since fresh produce and chilled foods need to be stored at different temperatures, the trucks that transport them must provide several temperature zones. The use of third-party logistics companies also helps to make small-batch replenishment economical. These companies allow economies of scale that are not feasible in a single supplier-customer relationship. By consolidating loads from multiple suppliers located near each other, a company can realize full-truckload economies without the batches coming from the same supplier. Of course, there are additional handling and administrative costs for such consolidations or multiple pickups, but the savings often outweigh the costs. Similarly, a third-party logistics company can use one truckload to deliver to customers who may be competitors, such as neighboring supermarkets. Where each customer was previously supplied separately via full truckloads, using third-party logistics companies can mean moving from weekly to daily replenishment. This is especially appealing for small customers whose volumes do not independently justify frequent full-truckload replenishment. Some grocery wholesalers that receive full-truckload shipments from the manufacturers and then ship mixed loads to their independent stores use logistics companies. In the United Kingdom, Sainsbury and Tesco have long used the National Freight Company for logistics.
As a result of the heightened awareness due to the ECR initiative in the grocery industry, we expect to see third-party logistics companies that forecast orders, transport goods, and replenish stores with mixed-SKU pallets from the manufacturers [13].
When customers spread their periodic orders or replenishments evenly over time, they can reduce the negative effect of batching. Some manufacturers coordinate resupply with their customers. For example, P&G arranges regular delivery appointments with its customers and thereby spreads the replenishment of all its retailers evenly over the weeks.
The simplest way to control the bullwhip effect caused by forward buying and diversions is to reduce both the frequency and the level of wholesale price discounting. The manufacturer can reduce the incentives for retail forward buying by establishing a uniform wholesale pricing policy. In the grocery industry, major manufacturers such as P&G, Kraft, and Pillsbury have moved to an everyday low price (EDLP) or value pricing strategy. During the past three years, P&G has reduced its list prices by 12 to 25 percent and aggressively slashed the promotions it offers to trade customers. In 1994, P&G reported its highest profit margins in twenty-one years and showed increases in market share. Similarly, retailers and distributors can aggressively negotiate with their suppliers for everyday low cost (EDLC). From 1991 to 1994, the share of trade deals in the total promotion budget of grocery products dropped from 50 percent to 46 percent.
From an operational perspective, practices such as CRP, together with a rationalized wholesale pricing policy, can help control retailers' tactics such as diversion. The use of CAO for placing orders also minimizes the possibility of such practices. Activity-based costing (ABC) systems enable companies to recognize the excessive costs of forward buying and diversions. When companies run regional promotions, some retailers buy in bulk in the area where the promotions are held and then divert the products to other regions for consumption. The costs of such practices are huge but may not show up in conventional accounting systems. ABC systems provide explicit accounting of the costs of inventory, storage, special handling, premium transportation, and so on, which previously were hidden and often outweigh the benefits of the promotions. ABC therefore helps companies implement the EDLP strategy [13].
When a supplier faces a shortage, instead of allocating products based on orders, it can allocate them in proportion to past sales records. Customers then have no incentive to exaggerate their orders. General Motors has long used this method of allocation in cases of short supply, and other companies, such as Texas Instruments and Hewlett-Packard, are switching to it. Sharing capacity and inventory information helps to alleviate customers' anxiety and, consequently, lessens their need to engage in gaming. But sharing capacity information is insufficient when there is a genuine shortage. Some manufacturers work with customers to place orders well in advance of the sales season, so that they can adjust production capacity or scheduling with better knowledge of product demand. Finally, the generous return policies that manufacturers offer retailers aggravate gaming: without a penalty, retailers will continue to exaggerate their needs and cancel orders. Not surprisingly, some computer manufacturers are beginning to enforce more stringent cancellation policies [13].
It can be contended that the bullwhip effect results from rational decision making by members of the supply chain. Companies can effectively counteract the effect by thoroughly understanding its underlying causes. Industry leaders like Procter & Gamble are implementing innovative strategies that pose new challenges: integrating new information systems, defining new organizational relationships, and implementing new incentive and measurement systems. The choice for companies is clear: either let the bullwhip effect paralyze you or find a way to conquer it. It is necessary to note here that distorted information about demand also arises inside the companies that make up the logistics supply chain, as a result of their internal policies and procedures. The main reasons for this are the following phenomena [23]:
Managers take decisions that are rational within the confines of their own functional department rather than from the point of view of the company and the supply chain as a whole: the production department, for example, aims at producing long batches in order to achieve economies of scale, while the customer service department maintains a high level of stocks to ensure a given level of customer service. Forecasting is carried out within each department instead of being consolidated at the company level, with many decision points influencing the execution of the forecast, and demand forecasts are manipulated in order to reach the targets that have been set.
A low level of managers' knowledge about the bullwhip effect and about its influence on supply chain management also contributes, as do internal company procedures that distort demand data: for example, minimum order sizes imposed between distribution centres and the factory, minimum production volumes, and policies of minimizing inventories in order to limit the costs they generate within the company. Demand for the resulting products is then supposed to be covered by the suppliers' fast reaction to changes in demand rather than by safety stocks; in effect, the responsibility for inventories and their costs is transferred to the suppliers.
When the demand reported by clients exceeds the supply (e.g., during a promotion, or ahead of an expected price increase or a change in taxes and excise duties on goods), the producer rations products: each order is filled only in part, depending on the level of supply. For example, if the supply covers 70% of demand, clients obtain 70% of what they ordered [18]. Clients aware of this procedure, wanting to obtain the quantities they actually need, overstate their orders. When demand stabilizes, part of the orders is withdrawn and, at the same time, new orders stop flowing in [23]. This effaces the picture of how real demand is forming and of the actual level of inventories in the whole supply chain, and it leads managers to take invalid decisions about production programs and the allocation of resources. This is especially important when new products are introduced to the market: it is hard for the producer to assess whether the demand for the novelties results from consumer interest or is a consequence of stock building in the distribution channels (the "fill the supply chain" effect).
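The proportional rationing rule described above, under which every customer receives the same fraction of its order, can be sketched as follows. The function name and example figures are illustrative assumptions:

```python
# Proportional rationing: when total orders exceed supply, each customer
# receives the same fraction of what it ordered (e.g. 70% in the text above).
# Customers aware of this rule inflate their orders: the "shortage gaming"
# cause of the bullwhip effect.

def ration(supply, orders):
    total = sum(orders)
    if total <= supply:
        return list(orders)          # no shortage: fill every order in full
    fraction = supply / total        # e.g. 0.7 when supply covers 70% of demand
    return [round(o * fraction, 2) for o in orders]

orders = [100, 200, 100]             # stated orders (true needs may be smaller)
allocated = ration(280, orders)      # supply covers 70% of the 400 units ordered
print(allocated)                     # each customer gets 70% of its order
```

Allocating in proportion to past sales records instead of current orders, as described later in the text, removes the incentive to overstate orders in the first place.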
A complicated system, such as a mechanical wristwatch, is formed of numerous components, in some cases as many as one thousand, that are linked to each other. But at the same time, the system is quite deterministic in its nature: it cannot behave in an uncertain manner. It is certainly very complicated, but it is also very easy to manage.
Complexity is inherent in the supply chain (SC), in the form of static complexity, which is related to the connectivity and structure of the subsystems involved in the SC (e.g., companies, business functions and processes), and dynamic complexity, which results from the operational behavior of the system and of its environment. The complex nature of the SC adds to the difficulty of managing it, so that it has become almost common sense to say that SCM is about managing the complexity inherent in the supply chain. Understanding and analyzing the complexity drivers first is an effective way to develop a clear strategy for dealing with complexity. Complexity in a SC grows as customer requirements, the competitive environment, and industry standards change, and as the companies in the supply chain form strategic alliances, engage in mergers and acquisitions, outsource functions to third parties, adopt new technologies, launch new products/services, and extend their operations to new geographies, time zones and markets [21].
It is possible to distinguish between three types of supply chain complexity: static, dynamic, and decision-making. Static (structural) complexity describes the structure of the SC, the variety of its components, and the strengths of their interactions; dynamic (operational) complexity represents the uncertainty in the SC and involves the aspects of time and randomness. The static-dynamic distinction has primarily been used to study complexity in manufacturing systems. A SC complexity driver is any property of a SC that increases its complexity. The classification of the types of SC complexity (i.e., static, dynamic, decision-making) corresponds with the classification of the complexity drivers according to the way they are generated: via the physical situation (e.g., number of products), operational characteristics (e.g., process uncertainties), dynamic behavior (e.g., demand amplification), and organizational characteristics (e.g., the decision-making process, IT systems) [21].
Internal drivers are generated by decisions and factors within the organization, such as product and process design. These drivers are relatively easy to leverage since they remain within the span of control. Drivers generated at the supply and/or demand interface (in cooperation with suppliers and customers) are related to the material and information flows between suppliers, customers and/or service providers. These drivers are somewhat manageable since they remain within the span of influence, and the level of coordination between supply chain partners plays a significant role when dealing with them. Thus, the power and trust mechanisms that affect the nature of supplier/customer relations are also important factors to be considered among the complexity drivers. External drivers are generated through mechanisms over which the company has little if any control, such as market trends, regulations and other environmental factors. Different approaches may be adopted to cope with the complexity drivers (e.g., for the internal-static drivers, possible approaches are product modularization, reducing product variety, mass customization, and business process reengineering). Decisions targeting any one driver may have a positive or negative effect on another driver, shifting the complexity of the SC from one driver to another, preferably to one over which the company has more control. Companies make use of this property when managing the complexity in their supply chains [21].
Complex systems share some common themes: (i) they are inherently complicated or intricate, in terms of factors such as the number of parameters affecting the system or the rules governing the interactions of its components; (ii) they are rarely completely deterministic, and state parameters or measurement data may only be known in terms of probabilities; (iii) mathematical models of the system are usually complex and involve nonlinear, ill-posed, or chaotic behavior; and (iv) the systems are predisposed to unexpected outcomes (so-called emergent behavior). This new science has an interdisciplinary impact in the fields of physics, mathematics, information science, biology, medicine, sociology and economics, and recently there has been new attention to applying these tools to management science as well. In recent years, different theories, methods, approaches and schools have appeared in the study of complex systems: (i) complex systems dynamics, (ii) self-organization, (iii) chaos theory, (iv) complex adaptive systems, (v) cybernetics of system evolution, (vi) complex system organization management, (vii) the philosophy of complex systems, and (viii) the school of complex networks. A fractal can be seen as an object or phenomenon with an invariant structure at different scales. There is no universally agreed definition of exactly what we should mean by a fractal, but two points are central: it should be an object with some type of non-integer dimension, such as the Hausdorff dimension, and it should be approximately (or statistically) self-affine (Mumford et al., 2002).
Fractal analysis assesses the fractal characteristics of data. It comprises several methods for assigning a fractal dimension and other fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. Several types of fractal analysis exist, including box counting, lacunarity analysis, mass methods, and multifractal analysis. A common feature of all types of fractal analysis is the need for benchmark patterns against which to assess outputs. These can be acquired with various types of fractal-generating software capable of producing benchmark patterns suitable for this purpose, which generally differ from software designed to render fractal art. Applications of fractal analysis include heart rate analysis, pathology, geography, diagnostic imaging, cancer research, classification of histopathology slides in medicine, geology, archaeology, fractal landscape or coastline complexity, electrical engineering, enzymology (Michaelis-Menten kinetics), etc.
The Hurst exponent is used as a measure of the long-term memory of a time series. It relates to the autocorrelations of the time series and the rate at which these decrease as the lag between observations increases. The Hurst exponent is also referred to as the index of long-term dependence. It quantifies the relative tendency of a time series either to regress strongly to the mean or to cluster in a direction.
The value of H ranges from 0 to 1. A value in the range 0-0.5 indicates a time series in which a high value tends to be followed by a low value and vice versa, i.e. a negative correlation (anti-persistence). A value in the range 0.5-1 indicates a positive correlation (persistence), in which a higher value in the data set tends to be followed by another higher value and vice versa. A value of H = 0.5 indicates a completely uncorrelated time series. A number of estimators of long-range dependence have been proposed.
[Figure not included in this excerpt]
The oldest and best-known is the rescaled range (R/S) analysis popularized by Mandelbrot and Wallis, based on the earlier hydrological findings of Hurst. Alternatives include detrended fluctuation analysis (DFA), periodogram regression, aggregated variances, the local Whittle estimator, and wavelet analysis, both in the time domain and the frequency domain.
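As a minimal illustration of rescaled range analysis (a Python sketch with illustrative function names, not the thesis's Matlab implementation): the R/S statistic is computed over windows of several sizes, and H is estimated as the slope of log(R/S) against log(window size).

```python
import math
import random
import statistics

def rescaled_range(series):
    """R/S statistic for one window: range of the cumulative
    mean-adjusted series divided by its standard deviation."""
    n = len(series)
    mean = sum(series) / n
    cum, total = [], 0.0
    for x in series:                 # cumulative deviations from the mean
        total += x - mean
        cum.append(total)
    r = max(cum) - min(cum)          # range of cumulative deviations
    s = statistics.pstdev(series)    # standard deviation of the window
    return r / s

def estimate_hurst(series, window_sizes):
    """Least-squares slope of log(R/S) vs log(n): the Hurst estimate."""
    xs, ys = [], []
    for n in window_sizes:
        # average R/S over non-overlapping windows of length n
        rs_vals = [rescaled_range(series[i:i + n])
                   for i in range(0, len(series) - n + 1, n)]
        xs.append(math.log(n))
        ys.append(math.log(sum(rs_vals) / len(rs_vals)))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(1)
white = [random.gauss(0, 1) for _ in range(4096)]
h = estimate_hurst(white, [16, 32, 64, 128, 256])
# for uncorrelated noise the estimate should lie near H = 0.5
```

For short windows the plain R/S estimator is known to be biased slightly above 0.5 for uncorrelated data, which is why corrections such as Anis-Lloyd are used in practice.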
Table 1.1 Hurst exponent

0 < H < 0.5   anti-persistent: a high value tends to be followed by a low value
H = 0.5       uncorrelated: the series has no memory
0.5 < H < 1   persistent: a high value tends to be followed by another high value
The basic Hurst exponent can be related to the expected size of changes as a function of the lag between observations, as measured by E[|X(t+τ) − X(t)|²] ∝ τ^(2H). For the generalized form of the coefficient, the exponent 2 here is replaced by a more general term, denoted by q.
A variety of techniques exist for estimating H; however, assessing the accuracy of the estimation can be a complicated issue. Mathematically, in one technique, the generalized Hurst exponent Hq = H(q) for a time series X(t) (t = 1, 2, ...) may be defined by the scaling properties of its structure functions S_q(τ):
S_q(τ) = ⟨ |X(t + τ) − X(t)|^q ⟩_T^(1/q) ∝ τ^(H(q))
where q > 0, τ is the time lag, and averaging is over the time window T ≫ τ, usually the largest time scale of the system.
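The scaling of the mean squared increment with the lag can be checked numerically for an ordinary random walk, for which H = 1/2 and E[|X(t+τ) − X(t)|²] grows linearly in τ. A short Python sketch (names are illustrative, not from the thesis):

```python
import random

def mean_sq_increment(path, lag):
    """Empirical E[|X(t+lag) - X(t)|^2] averaged over the whole path."""
    diffs = [(path[t + lag] - path[t]) ** 2
             for t in range(len(path) - lag)]
    return sum(diffs) / len(diffs)

# ordinary random walk: i.i.d. +/-1 increments, so H = 1/2 and the
# mean squared increment equals the lag, i.e. it scales as lag^(2H)
random.seed(7)
walk = [0]
for _ in range(20000):
    walk.append(walk[-1] + random.choice((-1, 1)))

m1 = mean_sq_increment(walk, 10)
m2 = mean_sq_increment(walk, 40)
ratio = m2 / m1   # should be close to 40/10 = 4 when H = 1/2
```

A persistent series (H > 1/2) would give a ratio larger than 4 here, an anti-persistent one (H < 1/2) a smaller ratio.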
Practically, in nature, there is no limit to time and thus H is non-deterministic as it may only be estimated based on the observed data; e.g., the most dramatic daily move upwards ever seen in a stock market index can always be exceeded during some subsequent day.
H is directly related to the fractal dimension D, where 1 < D < 2 and D = 2 − H. The values of the Hurst exponent vary between 0 and 1, with higher values indicating a smoother trend, less volatility, and less roughness. In the above estimation technique, the function H(q) contains information about averaged generalized volatilities at scale τ (only q = 1, 2 are used to define the volatility). In particular, the H1 exponent indicates persistent (H1 > 1/2) or anti-persistent (H1 < 1/2) behavior of the trend. For the Brownian random walk (brown noise, spectrum 1/f²) one gets Hq = 1/2, and for pink noise (1/f) one gets Hq = 0. The Hurst exponent for white noise is dimension dependent; for 1D and 2D it is Hq(1D) = −1/2 and Hq(2D) = −1.
For the popular Lévy stable processes and truncated Lévy processes with parameter α it has been found that Hq = q/α for q < α and Hq = 1 for q > α. A method to estimate H from non-stationary time series is called detrended fluctuation analysis. When H(q) is a non-linear function of q, the time series is a multifractal system. Usually the method of rescaled range analysis is used.
The Hurst exponent describes the raggedness of the resultant motion, with a higher value leading to a smoother motion. Fractional Brownian motion (fBm) was introduced by Mandelbrot and van Ness (1968). The value of H determines what kind of process the fBm is:
- if H = 1/2 then the process is in fact a Brownian motion or Wiener process;
- if H > 1/2 then the increments of the process are positively correlated;
- if H < 1/2 then the increments of the process are negatively correlated.
The increment process, X(t) = B_H(t+1) − B_H(t), is known as fractional Gaussian noise. There is also a generalization of fractional Brownian motion: n-th order fractional Brownian motion, abbreviated as n-fBm. n-fBm is a Gaussian, self-similar, non-stationary process whose increments of order n are stationary. For n = 1, n-fBm is classical fBm.
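The sign of the increment correlations listed above can be illustrated with the standard closed-form autocovariance of unit-variance fractional Gaussian noise, γ(k) = ½(|k+1|^(2H) − 2|k|^(2H) + |k−1|^(2H)). A short Python sketch (function name is illustrative):

```python
def fgn_autocov(k, h):
    """Autocovariance of unit-variance fractional Gaussian noise
    at integer lag k (standard closed form)."""
    return 0.5 * (abs(k + 1) ** (2 * h)
                  - 2 * abs(k) ** (2 * h)
                  + abs(k - 1) ** (2 * h))

# lag-1 covariance: positive for H > 1/2, zero at H = 1/2,
# negative for H < 1/2, matching the three cases listed above
pos = fgn_autocov(1, 0.7)   # persistent increments
zero = fgn_autocov(1, 0.5)  # ordinary Brownian motion
neg = fgn_autocov(1, 0.3)   # anti-persistent increments
```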
The bullwhip effect is one of the key phenomena examined in supply chain management. It denotes the situation in which orders placed with suppliers tend to be more variable than sales to buyers, so that consumer demand is distorted [16]. This distortion of demand propagates to higher stages of the chain in an amplified form. High inventory levels and poor customer service levels are standard symptoms of the bullwhip effect in a supply chain; unstable production and inventory costs and increasing lead times provide further evidence of it, while profit margins and product availability fall [4]. Empirical studies reported in the literature indicate that completely eliminating the bullwhip effect can raise product profitability by about 10-20%, while merely reducing it yields a possible profitability increase of about 5-10%. Combining the elimination or reduction of the bullwhip effect with the reduction of other properties (e.g. seasonality) can yield profitability gains of about 15-30%, depending on the specifics of the business environment. It is also believed that smoothing or amplification behavior may vary among nations and cultures. Mollick (2004) described evidence of production smoothing in the Japanese automotive industry, where smoothing is more common due to the prevalence of Heijunka (levelling) and Just-In-Time manufacturing strategies. Shan, Yang, Yang and Zhang (2014) studied the bullwhip effect in China, finding that it was gradually being reduced [27].
For the reduction and elimination of the bullwhip effect, the primary and most important step is its quantification, so it is essential to calculate the bullwhip effect accurately before further analysis. In this work, therefore, a more accurate way of calculating the bullwhip effect is used, one that takes the dependency within the data set into account.
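The conventional quantification, the ratio of order variance to demand variance, can be sketched as follows (a Python sketch with hypothetical data; the thesis's own datasets and Matlab code are not reproduced here):

```python
import statistics

def bullwhip_ratio(orders, demand):
    """Classical bullwhip measure: Var(orders) / Var(demand).
    A value above 1 indicates amplification upstream; the measure
    implicitly assumes i.i.d. data, which Hurst analysis questions."""
    return statistics.variance(orders) / statistics.variance(demand)

# hypothetical weekly figures for one echelon
demand = [100, 102, 98, 101, 99, 103, 97, 100]
orders = [100, 110, 90, 105, 95, 115, 85, 100]  # amplified response
bwe = bullwhip_ratio(orders, demand)
# bwe > 1 signals the bullwhip effect under the i.i.d. assumption
```

Here Var(demand) = 4 and Var(orders) = 100, so the ratio is 25: the echelon amplifies demand variability considerably.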
A manufacturing operation is a system made up of many subsystems, or parts. Because of the interrelationships among these parts, academics and practitioners alike will often say that a manufacturing system is complex, but they usually do not give a precise definition of complexity. Complexity is difficult to define precisely.
Senge (1990) defined a complex system as one which has a large number of parts whose relationships are not simple. Note that the parts themselves may be simple, but their relationships are not. Complexity is different from subtlety, and it does not exist where there is either complete order or complete disorder. Subtlety exists when the consequences of interventions are not obvious to most of the system participants. Subtle systems have few parts, but there are many layers of relations between those parts [20].
Cooper (1992), on the other hand, said that a complex system has many parts with numerous relations between them, each of these relationships being obvious. There is no complexity in systems which are completely organized: while all of the relationships in a completely organized system are obvious, there are only a limited number of relationships between the parts. There is also no complexity at the other end of the continuum, where everything is completely disordered. All the elements in completely disordered systems act in an unpredictable way. While there may be many elements in a disordered system, their unpredictability means there is no obvious relationship between the elements, so the system is not a complex one. Complex systems exist only when there are ordered relationships between the elements in the system. So complexity exists in those systems which lie on the continuum between complete disorder and total order.
Frizelle and Woodcock (1995) argue that measuring manufacturing complexity provides a useful metric for improvement. They argue that systems with higher complexity have more problems than systems with lower complexity, so by measuring a system's complexity, managers can identify problems in the system that are hindering the production flow. In addition, if a manufacturing complexity measure could be easily computed, it would allow researchers to compare different system designs and structures with one another, and it would allow system managers to determine how system changes (e.g., policy and/or process changes) affect the system's complexity. Finally, the measure would allow a precise measurement of how changes in complexity in one system influence other systems in the organization. They argued that in addition to validity, a useful complexity measure needs to be composed of separable, additive components. By being separable and additive, the manufacturing complexity measure would allow easy analysis of managers' interventions on the system.
Separating the complexity measure into two components simplifies its computation [3]. One component measures the system structure and the other the system uncertainty. They used static complexity as a measure of complexity due to the system design, while dynamic complexity was seen as the result of uncertainties in the system while it is operating (e.g., machine breakdowns). They introduced an entropy-based measure of manufacturing complexity and concluded that complexity depends upon the probability of the different states. The entropy measure on which they base their complexity measures uses a base-2 log. They argued that the advantage of using a log is that it reduces the impact of adding one more item, part or relationship to the system. For example, the addition of one new end product to a firm that currently produces five end products increases that system's complexity more than if the same addition were made to a firm producing 100 end products.
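The base-2 log argument can be illustrated numerically. The following Python sketch assumes equally likely states, a simplification of the entropy measures discussed, under which the entropy of n states reduces to log2(n):

```python
import math

def shannon_entropy(probs):
    """Base-2 Shannon entropy: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropy_equiprobable(n_states):
    """Entropy of n equally likely states, which equals log2(n)."""
    return shannon_entropy([1.0 / n_states] * n_states)

# adding one more end product raises complexity less for a large firm:
small_firm_jump = entropy_equiprobable(6) - entropy_equiprobable(5)
large_firm_jump = entropy_equiprobable(101) - entropy_equiprobable(100)
# small_firm_jump (about 0.26 bits) exceeds large_firm_jump
# (about 0.014 bits), as the base-2 log argument predicts
```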
Filiz Izik (2010) introduced a new entropy-based method for measuring complexity in supply chains. Supply chains and supply chain management have become important topics in the globalized world in recent years; therefore, techniques for analyzing, measuring and reducing the increasing complexity of supply chains play an important economic role. In that paper, the uncertainty of the supply chain was measured with entropy methods, from the viewpoint of both the classical approach (Shannon's entropy measure) and newly proposed approaches [7]. The goal of the study is to give a formal approach able to measure the influence of the entropy measure in a supply chain complexity (SCC) environment, and its results appear interesting and meaningful. Shannon's entropy measure is used because it is simple and the most widely used measure of uncertainty. Of course, it is not the only measure of uncertainty or complexity; other measurement methods (such as the Lyapunov exponent or fuzzy set theory) could be studied in the future as well.