03/23/2015 : 0 Comments
Gartner posted an article detailing the benefits of pull replenishment in Demand Driven Value Networks:
Demand-driven value networks (DDVNs) integrate processes and data to translate demand signals into a supply response that creates value and mitigates risk.
The manufacturing, retail, high-tech, life science and healthcare industries with demand-driven capabilities perform better in the long term than peers with traditional, cost-focused supply chains. These companies grow revenue faster, achieve more than 15% higher perfect-order rates and reduce inventory levels by as much as one-third. They leverage an outside-in view based on insights about customer value. They apply this view to their product portfolios, supply networks and service processes to deliver customer value and profitable growth.
Orchestration of value networks includes selectively collaborating with customers, suppliers and partners, as well as effectively managing trade-offs internally through cross-functional processes that synchronize product, demand and supply decisions for maximized value. Critical capabilities for orchestration include supply chain visibility, agile decision making in response to volatility and demand shaping to optimize profitable balance.
Invistics has been helping high-mix manufacturers and distributors design and implement their Demand Driven Value Network for over 15 years. Our Pull Design Workshop is a great way to get started using three steps:
1.) Reaching consensus on the type of pull strategy to be adopted, the production planning parameters that will be impacted and the process and tools to be deployed to ensure a systemic and sustainable implementation.
2.) Developing a relevant business case.
3.) Identifying processes and tools that will support the new approach on an on-going basis.
04/20/2014 : 0 Comments
Our last blog post provided a general summary of Kingman's Equation and how it relates to your manufacturing operation. Today we're going to delve a little deeper into the equation to show that when it comes to lowering the Average Queue Time (or Average Wait Time) of your resources, Utilization is King.
For every system, whether it's a single machine or an entire factory, the time a resource spends as raw material or work-in-progress can be divided into two parts.
- Process Time. This is the value-added time it takes the machine or machines to process the resource and churn out a finished product.
- Queue/Wait Time. This is the non-value-added time the resources are set aside to wait in the queue of a machine or bottleneck, during a machine's setup, etc.
Lead Time = Queue Time + Process Time. In most manufacturing systems, queue time can comprise 80-85% of the lead time. This is all non-value-added time that should be reduced in order to maintain a Lean operation.
Now let's take another look at Kingman's equation:
AQT ≈ (p / (1 − p)) × ((Ca² + Cs²) / 2) × τ
where:
AQT = Average Queue Time
p = utilization, expressed as a decimal
Ca², Cs² = the squared coefficients of variation of arrival and process times
τ = average process time
So what does this mean exactly? First let's look at some examples where utilization (p) is at .25, .5, .7, .9, and .99. For these examples, we'll just assume (Ca² + Cs²)/2 = 1 and process time (τ) = 60 minutes.
Case 1: If p = .25, Average Queue Time (AQT) = (.25/(1−.25)) × 60 = 20 mins
Case 2: If p = .5, AQT = (.5/(1−.5)) × 60 = 60 mins
Case 3: If p = .7, AQT = (.7/(1−.7)) × 60 = 140 mins
Case 4: If p = .9, AQT = (.9/(1−.9)) × 60 = 540 mins
Case 5: If p = .99, AQT = (.99/(1−.99)) × 60 = 5940 mins, or 99 hours
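The cases above can be reproduced in a few lines of Python (a minimal sketch of Kingman's approximation; the function and variable names are our own):

```python
def kingman_aqt(utilization, cv_arrival_sq=1.0, cv_process_sq=1.0, process_time=60.0):
    """Kingman's approximation for Average Queue Time.

    utilization: machine utilization as a decimal (must be < 1)
    cv_arrival_sq, cv_process_sq: squared coefficients of variation of
        arrival and process times
    process_time: average process time (here, minutes)
    """
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    variability = (cv_arrival_sq + cv_process_sq) / 2
    return (utilization / (1 - utilization)) * variability * process_time

# Reproduce the five cases: (Ca^2 + Cs^2)/2 = 1, process time = 60 minutes
for p in (0.25, 0.5, 0.7, 0.9, 0.99):
    print(f"p = {p:.2f}: AQT = {kingman_aqt(p):.0f} mins")
```

Plugging in the five utilization values prints the queue times from the cases above, and makes it easy to experiment with your own utilization and variability numbers.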
In this example, since (Ca² + Cs²)/2 = 1, the standard deviation of each respective variable would have to equal its mean. While such high variability does exist in the real world, it's not very common. Regardless, this example is enough to illustrate two very important take-aways.
1.) Even with such high variability in both arrival and process times, if the utilization of the machine(s) is low, as it is in Case 1 or Case 2, the Average Queue Time is still manageable. In terms of reducing your AQT, machine utilization is by far the most important factor.
2.) As you can see from the dramatic, nonlinear increases in AQT across the cases, it is very, very wasteful to run your machines at such high utilization because your Average Queue Time goes through the roof. Can you imagine a 99-hour Average Queue Time for each of your products? And the screeching of your customers or sales reps? No thanks.
One final thing I should note is that the AQT in Kingman's equation is not the exact AQT, but more of a likely upper bound on your real AQT. It's accurate enough to get the point across, though.
04/10/2014 : 0 Comments
In queueing theory, Kingman's formula states that the mean waiting time is given approximately by:
AQT ≈ (p / (1 − p)) × ((Ca² + Cs²) / 2) × τ
where:
AQT = Average Queue Time
p = utilization, expressed as a decimal
Ca², Cs² = the squared coefficients of variation of arrival and process times
τ = average process time
So what are the practical, manufacturing take-aways?
- The longer the average process time, the more the queue length matters. For example, a process that takes a minute with a queue of 5 is much better than a process that takes an hour with a queue of 5.
- Utilization is king. If utilization is low (~50% or less), arrival variation and process variation will have a small impact. If utilization is high (80% or higher), arrival and process variation will matter much more.
- In service and assembly operations, extra capacity may not be as expensive as in their manufacturing counterparts, because it is easy to move or add employees between functions.
- Since many service processes are longer than manufacturing processes, variation is generally more tolerable in service than in manufacturing. In other words, it is generally much more effective to focus on reducing failure demand than variation when it comes to service organizations.
- Reducing time variation is generally less critical than reducing utilization. Utilization is affected by errors which generate failure demand or rework.
- Arrival variation should not be ignored. It is just as important as process variation. Can your arrival variation be influenced through salesmen, incentives, informing customers or reducing supply chain amplifications?
- The work release behavior of processes upstream of the bottleneck is important. The smoother (less lumpy) the orders come in at those workstations, the faster the bottleneck will process the orders.
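The "utilization is king" point can be checked numerically with a quick sketch of Kingman's approximation (the parameter values below are purely illustrative):

```python
def kingman_aqt(utilization, cv_sq_sum, process_time=60.0):
    """Kingman's approximation: AQT ≈ (p/(1-p)) * ((Ca^2 + Cs^2)/2) * tau."""
    return (utilization / (1 - utilization)) * (cv_sq_sum / 2) * process_time

# Double total variability (Ca^2 + Cs^2 goes from 1 to 2) at low vs. high utilization
low_base,  low_var  = kingman_aqt(0.5, 1), kingman_aqt(0.5, 2)
high_base, high_var = kingman_aqt(0.9, 1), kingman_aqt(0.9, 2)

print(f"p=0.5: AQT goes from {low_base:.0f} to {low_var:.0f} mins (+{low_var - low_base:.0f})")
print(f"p=0.9: AQT goes from {high_base:.0f} to {high_var:.0f} mins (+{high_var - high_base:.0f})")
```

The same doubling of variability costs only 30 extra minutes of queue time at 50% utilization but 270 extra minutes at 90%, which is why variation reduction pays off most where utilization is high.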
Additionally, the Theory of Constraints advocates five steps of improvement: identify, exploit, subordinate, elevate, and repeat. Kingman's equation gives these additional insights:
- To identify the constraint (bottleneck), it is easier to examine only the load and compare it with demonstrated capacity. Kingman shows that overload begins at less than 100% utilization, and that sensitivity to variation is particularly important at high utilization.
- Exploiting constraints should include variation reduction in addition to other methods such as buffering with inventory.
- Subordinating other resources should include looking upstream of the constraint to examine arrival variation coming into the bottleneck.
Kingman's equation ties together Lean, Six Sigma, TOC, and service systems to show that there's no one-size-fits-all approach when it comes to adding value to your operations. The key is knowing when to apply each school of thought.
(Be sure to check out John Bicheno’s “The King of Equations” in the Lean Manufacturing Journal for more information and tips related to Kingman’s equation).
03/03/2014 : 0 Comments
Myth#1: Lean is a huge initiative that drains financial and human resources.
Believe it or not, many companies have opted to forgo Lean because of this very myth. The thought of overhauling their whole process puts them into a panic: trying to figure out where funding will come from and how much it will cost to train their employees.
In these economic times, everyone is looking for ways to cut costs and save. However, companies shouldn't worry about Lean processes draining their financial or even human resources. The interesting truth about Lean is that, if processes are implemented properly, it will not cost as much as previously feared. In fact, some companies have reported a substantial return on investment within the first year.
To conserve financial and human resources and proceed with a Lean implementation, the best thing is to start with a fast, low-effort pilot that will quickly make up for the resources used in the project. The benefits and savings that a company will see within a few months will ease financial fears and pay for any future expansion of Lean.
Has your company overlooked Lean processes because of this misconception? Have you tried Lean and run into any strains, either financially or worker related? We would like to hear your input. Let’s discuss!
01/15/2014 : 0 Comments
Leaders, competitors, and followers exist in all industries, and especially in manufacturing. How does your company rank? Is your organization operating on the model that others watch — or are you following others? Find out where your organization stacks up with this quiz.
Note: Only add the corresponding points if your answer to the question is “Yes”.
1. Your company involves individuals from all business segments (including suppliers) in its business plans (add 1 point)
2. Your company focuses mainly on manufacturing areas & includes only a portion of those in aligned areas (add 3 points)
3. Your company focuses on the plant floor with weak support (add 5 points)
4. Your company has a broad use of metrics — some are automated and some are manual. (add 3 points)
5. Your company has a broad application of methodologies (i.e. Kanban, 5S) with wide access to real-time data (add 1 point)
6. Your company uses few Lean tools and has limited access to manufacturing data (add 5 points)
7. Your company’s business processes are embedded in Manufacturing Execution Systems which ensures consistency (add 1 point)
8. Your company applies Manufacturing Execution Systems and continuous improvement programs (add 3 points)
9. Your company’s metrics are calculated frequently by IT systems with high accuracy and credibility (add 1 point)
10. Your company has several Lean tools and good access to manufacturing data (add 3 points)
11. Your company has manual Business Process Management based on tribal knowledge and is skeptical about new technology (add 5 points)
12. Your company has a few metrics that are manually calculated and not usually actionable (add 5 points)
If you scored 15 to 20 points, your organization is considered a Follower and there is room for improvement. Invistics offers resources such as free white papers, executive briefs, and hosts monthly webinars at no charge to help you achieve Leader status.
If you scored 9 to 14 points, your organization shows characteristics of being a follower and a leader, a status known as a Competitor. You are making the necessary steps toward Leader status. To help you on your path, check out the Invistics website for free information and monthly educational webinars to help you become a Leader more quickly.
12/06/2013 : 0 Comments
Steven Kuehn, Editor-in-Chief, recently published a great overview of the progress the pharmaceutical industry has made in recent years in the area of cutting waste and improving operations to become "Lean": http://www.pharmamanufacturing.com/articles/2013/honing-the-competitive-edge/?start=3
It's only been within the last 15 years that pharmaceutical companies have begun to take the concepts of Lean and Six Sigma more seriously, mainly due to major cost constraints that were tightening profit margins across the industry. Kuehn writes:
“As mentioned, since approximately 2002 the pharmaceutical industry felt increasingly pressured by a number of new and relatively dramatic external and internal market forces (think patent cliff) and began to face up to its profligate spending sustaining inefficient drug development and manufacturing processes. With the cost of bringing a single blockbuster drug to market reaching some $1.3 billion and, according to Eli Lilly, the success rate of new chemical compounds falling from 12% a decade ago to 8% today, drug manufacturers indeed continue to have a tough fight ahead to remain competitive and sustain commercial success.”
However, despite the industry's best efforts to reduce waste and variability, Lean hasn't quite caught on; many pharmaceutical professionals still believed Lean to be merely a cost-cutting methodology.
Nigel Smart, principal at Smart Consulting Group and author of the just-released book, Lean Biomanufacturing, says, “For perhaps over a decade now, the pharmaceutical/life sciences industry has been attempting to apply Lean systems to its various process systems. Initially, there were attempts to apply the principles to manufacturing processes in an attempt to mimic the advantages seen in other industrial sectors, such as those in the auto industry. However, if one was to critically prepare a performance scorecard of the implementation of Lean throughout the [Pharma] industry, this analysis would at best give you a normalized score of perhaps 4/10.”
The article is worth reading in full to see what others in the pharma industry are doing to achieve best-in-class manufacturing operations using Lean Six Sigma methodology.
12/06/2013 : 0 Comments
Invistics has been providing educational webinars on a variety of Lean manufacturing topics since 2011. We strive to ensure our webinar presentations are up to date with the best practices utilized in the high-mix manufacturing world. Check back every month for a variety of topics ranging from case studies and live demos to best practices in inventory and lot size optimization (and more!). This month we're offering two topics:
Join Tom Knight, Invistics' CEO, for our new webinar, an in-depth case study on Dupont Pharmaceuticals. This particular facility was plagued by highly variable processes and products, which led to increased cycle times, inventory shortages, and reduced customer service levels. Learn how Invistics put in place Lean processes that realigned product flows, established key metrics, and integrated this approach with IT systems in order to reduce cycle times, lower inventory levels, and improve service.
Date: Wednesday, December 18, 2013
Time: 1:00 PM (Eastern Time)
Duration: 60 minutes
Join Xing Guan, Process Improvement Consultant, to learn Best Practices for Lot Size Optimization for high-mix manufacturers and see how industry leaders analyze & calculate the “sweet spot” between changeover costs, inventory costs, and customer service. The presentation will include a detailed case study of the project Invistics did for a major manufacturer as well as a live demo of Invistics Lot Sizer optimization software.
Date: Thursday, December 19, 2013
Time: 1:00 PM (Eastern Time)
Duration: 60 minutes
Register Here: http://www.invistics.com/resources/webinars/
10/10/2013 : 0 Comments
Invistics CEO, Tom Knight, presented as the Keynote Speaker at the annual 2013 IIE Lean & Six Sigma conference held in Atlanta, Georgia. Tom presented on the topic of Lean Supply Chain with an emphasis on Pull Design for High-Mix Manufacturers.
Check our monthly webinar page to see which Lean topics Tom and others will be presenting in October!
09/13/2013 : 0 Comments
– Excessive working capital
– Suboptimal lot sizes, which led to:
  - Excessive cycle times due to large lot sizes for many SKUs
  - Too-frequent changeovers due to small lot sizes for some SKUs
  - Capacity concerns due to high utilizations
Bayer Material Science's Baytown facility was challenged to drastically lower working capital. Bayer initially tried using simple inventory analysis, but quickly realized that this approach would fall short because it did not account for their critical capacity issues. One factor contributing to the high levels of required working capital was suboptimal lot sizes throughout the operation. Making this particularly challenging was the fact that some workcenter utilizations were at or near 100%, requiring excessive overtime to meet customer orders.
The team at Bayer realized that this excessive working capital could be controlled if they could take advantage of their knowledge of how lot sizes related to overall costs. As the familiar figure below shows, there are at least two dueling factors to consider when determining an optimal lot size:
1) A component that increases linearly with the lot size, due to warehouse and logistics costs as well as holding costs. If only this component were considered, we would always choose the smallest possible lot size to minimize overall costs.
2) A component that decreases rapidly as lot size grows, due to yield loss, quality loss, and unit reliability costs. If only this component were considered, we would always choose the largest possible lot size to minimize overall costs.
In the simplified example above, the trick is to find the optimal point that balances both of these components and finds the lot size that minimizes our overall costs.
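A toy version of this tradeoff makes the "sweet spot" concrete. The sketch below is a hypothetical single-SKU cost model (the numbers are assumed for illustration, and this is not Invistics' actual Lot Sizer model):

```python
import math

def total_cost(lot_size, annual_demand, setup_cost, unit_holding_cost):
    """Toy annual cost model for a single SKU.

    Holding cost grows linearly with lot size (average inventory ~ Q/2);
    setup/changeover cost falls as lot size grows (fewer runs per year).
    """
    holding = unit_holding_cost * lot_size / 2
    setup = setup_cost * annual_demand / lot_size
    return holding + setup

# Illustrative numbers (assumed, not from the Bayer project)
demand, setup, holding = 10_000, 500, 2.0

# Brute-force search over candidate lot sizes for the minimum-cost point
best_q = min(range(100, 10_001, 100), key=lambda q: total_cost(q, demand, setup, holding))
eoq = math.sqrt(2 * demand * setup / holding)  # classic EOQ formula for comparison
print(f"brute-force sweet spot ≈ {best_q}, EOQ formula gives {eoq:.0f}")
```

For a single SKU this reduces to the classic EOQ calculation. As the next paragraph explains, the hard part in a high-mix plant is that this simple cost curve ignores capacity, congestion, and service levels on shared workcenters.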
The challenge is that while these tradeoffs are easily understood, they are exceedingly difficult to actually sit down and calculate for a system of materials running through shared workcenters. In other words, like many manufacturing challenges, once a manufacturing process becomes ‘high mix’, simple techniques and approaches become significantly more challenging to implement. To be specific, the cost functions that describe the curves shown in the Figure above lack any reference to capacity, flow congestion, starvation, customer service level, or any of the needed factors required to make a fully informed planning decision. Clearly, a more sophisticated approach was needed to find the ‘sweet spot’ for Bayer’s lot sizes.
Step 1: Model the flow using Invistics’ Lot Sizer
The team turned to Invistics to assist with the calculation of these optimal lot sizes. Invistics’ ‘Lot Sizer’ uses a unique and patented approach to find this sweet spot between inventory, service, and changeover costs while taking the effect of increasing utilization into account.
Gather Required Data
The first step in the process is to identify the needed set of inputs for Lot Sizer. The diagram below provides a summary:
Using these inputs, Lot Sizer was able to model the flow of materials (including into and out of various storage tanks) through Bayer’s facility.
Step 2: Perform Analysis in Lot Sizer
Once the data model was built, Bayer was able to analyze the results in Lot Sizer.
Figure 1: Capacity Analysis from Lot Sizer
The first recommendation highlighted by Lot Sizer concerned optimal lot sizes for the packaging lines. As is often the case, the current lot sizes in SAP had been created years ago using a "rule of thumb," the specifics of which were now unknown to the team. As the table below shows, Lot Sizer found that some SKUs had lot sizes smaller than ideal, while others had lot sizes that were too large. The result was an inefficient use of drum line capacity, which drove excessively high overall costs.
Figure 2: Partial list of output from Lot Sizer
In fact, even though the average lot size was recommended to decrease, as the figure below shows, there were so many SKUs with smaller-than-optimal lot sizes that the small increase in inventory holding costs from increasing them was dramatically offset by the cost savings from the expected reduction in setup costs.
The figure below is a partial, but indicative representation of the recommendations from this project. As we expanded to the full product line, similar opportunities were seen for large inventory and overall cost reductions.
Step 3: Confirm Key Inputs as Needed Using Sensitivity Analysis
While most of the needed inputs were straightforward, at Bayer some required a bit of additional analysis. In particular, the minimum allowable batch size (aka "lot size") was an input that was up for debate. By running Lot Sizer multiple times with varying minimum batch sizes, the team was able to analyze the sensitivity of the results to the changing input (see figure below). Doing so helped them realize that the results weren't particularly sensitive to this input within the range under discussion (a minimum batch size somewhere between 5,000 and 10,000 kg, with a best guess of 8,800 kg), so it was determined that their best guess would work satisfactorily.
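Generically, this kind of sensitivity check is just a loop over the disputed input. The sketch below uses a stand-in EOQ-style optimizer with made-up numbers, not Lot Sizer itself:

```python
import math

def optimized_cost(min_batch, annual_demand=100_000, setup_cost=400, holding_cost=0.05):
    """Stand-in optimizer: pick the cheapest lot size at or above min_batch
    under a toy holding-vs-setup cost model (assumed numbers, not Bayer's)."""
    eoq = math.sqrt(2 * annual_demand * setup_cost / holding_cost)
    q = max(eoq, min_batch)  # the minimum-batch constraint only binds if eoq < min_batch
    return holding_cost * q / 2 + setup_cost * annual_demand / q

# Sweep the disputed input across the range the team was debating
for min_batch in range(5_000, 10_001, 1_000):
    print(f"min batch {min_batch:>6} kg -> annual cost ${optimized_cost(min_batch):,.0f}")
```

In this toy example the cost is flat across the whole debated range (the constraint never binds), mirroring the team's finding that their best guess of 8,800 kg was good enough.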
Step 4: Implement Recommendations
- At this point, the team decided to move ahead with the recommendations provided by Invistics' Lot Sizer. Between 80% and 90% of the recommended lot sizes were implemented as-is (the remainder required some small tweaks by the team before being put into place). To implement the lot sizes, the Bayer team simply entered the new values into SAP, which then used these improved values in its usual MRP and APO runs.
- For the additional tank capacity recommended during the what-if analysis, the team again decided to move ahead and implement the changes. In this case, this meant dedicating several tanks from the tank farm to certain products, rather than having all tanks free to cycle between all products. The change didn't end up costing anything, as the team was able to accomplish it via a simple management policy rather than through any piping or valving changes.
Within the next year, running with these improved lot sizes, the Baytown facility's inventory levels decreased by $4.5 million without any sacrifice to customer service. These improvements were all realized by simply changing the lot size values in SAP to the values calculated by Lot Sizer.
In addition, overall costs were decreased by $500k annually. This improvement was due to a combination of the improved lot sizes and reconfiguring of the tank farm as recommended during the project.
At the time of this writing, Bayer is exploring the expansion of these techniques across their global supply chain.
06/12/2013 : 0 Comments
Over the past two years, in speaking with countless inventory managers and supply chain professionals, we've heard the same pain voiced again and again. It's particularly acute in high-mix environments, where there's often a long tail of product variety and a high SKU count, plus the additional challenges of shared equipment in the production environment and a long, extended supply chain.
The pain is this: there's too much inventory of the "wrong" stuff, and not enough of the "right" stuff. The fallout is that working capital is tied up in stock that often sits wastefully on a warehouse floor, taking up space and leading to higher rates of obsolescence, damage, or even theft. On the other end of the spectrum, often simultaneously, we see that other SKUs aren't stocked enough. This is a much bigger issue, as it can directly affect your customer service level and lead to firefighting and expediting or, worst case, a lost sale.
So how do you ensure your customer service performance in a high-mix environment without loading your stocks to the sky with excess inventory?
Over the years we've seen a variety of inventory management techniques. On the low end, we see what we call "acoustical" inventory management, where whoever shouts the loudest gets their orders pushed through. You don't need to be an expert to realize this is often a stressful way to manage a business, and it leaves a lot to be desired in terms of root-cause analysis of where the real issue lies.
The most common form of inventory management is "rule of thumb," where a company hires an experienced supply chain professional to analyze previous sales history, often comparing new SKUs with similar old SKUs, in order to make an educated guess at how much stock to hold. With enough effort, trial and error, and Excel magic, a materials manager might get pretty close to optimal stocking levels. However, as the SKU count grows, as new products are introduced, and as seasonality rears its ugly head, it becomes harder and harder to maintain such an approach. And what happens when that experienced professional retires or leaves for greener pastures?
The real problem, however, is that these approaches often neglect a key input necessary for accurate inventory management: variability. A SKU with an average forecast or demand history of 50 units a week and a standard deviation of 5 cannot and should not be managed the same way as a SKU averaging 50 units a week with a standard deviation of 100. The coefficient of variation must be considered. The same is true on the supply side, where replenishment or production lead time is often a fluctuating rather than a static number. This is why manufacturers are often seen purchasing from more expensive but closer-to-home vendors rather than cheaper vendors in Southeast Asia: the variability of lead times is often a much greater threat than the potential cost savings.
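One common way variability enters the math is through a safety stock formula that combines demand and lead-time variation. The sketch below is a textbook calculation, not Invistics' proprietary method, and the service level and lead-time figures are assumed for illustration:

```python
import math
from statistics import NormalDist

def safety_stock(avg_demand, sd_demand, avg_lead_time, sd_lead_time, service_level=0.95):
    """Textbook safety stock for variable demand AND variable lead time:
    SS = z * sqrt(L * sd_d^2 + d^2 * sd_L^2)
    (demand per week, lead time in weeks; assumes independent, roughly normal inputs)"""
    z = NormalDist().inv_cdf(service_level)  # safety factor for the target service level
    return z * math.sqrt(avg_lead_time * sd_demand**2 + avg_demand**2 * sd_lead_time**2)

# Two SKUs with identical average demand but very different variability
steady = safety_stock(avg_demand=50, sd_demand=5, avg_lead_time=4, sd_lead_time=0.5)
erratic = safety_stock(avg_demand=50, sd_demand=100, avg_lead_time=4, sd_lead_time=0.5)
print(f"steady SKU needs ~{steady:.0f} units of safety stock; erratic SKU needs ~{erratic:.0f}")
```

Despite identical average demand, the erratic SKU needs roughly seven times the buffer of the steady one, which is exactly the distinction that rule-of-thumb methods miss.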
With this variability, and with the challenge of a massive SKU count in the thousands, the answer to good inventory optimization often lies outside Excel spreadsheets, in more complex, more robust forms of inventory analysis. And despite the Lean principle of keeping it simple, software is often simply necessary to crunch large data sets, account for variability in both supply and demand, and return the optimal inventory levels that let you operate with high customer service levels while carrying just enough inventory to meet them. No more, no less.
For more information, visit our Inventory Optimization page and see how our company, Invistics, might be able to share our experience and our software tools to lessen the burden of finding and hitting that sweet spot in right-sizing your stock.