WAR GAMES

Apr 01, 2001 9:30 PM  By Lawrence Dean Shemesh

Would Waterloo have turned out differently if Napoleon had taken along a laptop? Perhaps not, but computer simulation models can serve as advanced tactical weapons — on or off the battlefield

When modern armies prepare for combat, they model their weapons systems, equipment, supplies, and deployment of troops. Commanders pit their strategic initiatives against the models of potential adversaries, then address and quantify tactical considerations such as response time, accuracy, and cost.

Whether they evaluate military strength or distribution center efficiency, computer simulations graphically demonstrate the attributes of your command and its current state of readiness for a variety of missions. Such simulations predicted a low-loss victory in the Gulf War. The model was incredibly accurate — much more so than any previous attempt at forecasting the results of a military engagement. This improved ability to project the outcome of extremely complex logistics operations is due in large part to the new generation of computers and modeling/simulation software now available.

Rules of engagement

Now that the dust has settled on the first wave of troops that invaded the Web-enabled frontier, some of the battle-weary soldiers have slowed their progress to reassess the mission. Others have bled to death while the venture capitalists that funded the first attack withdrew their support and retreated in the face of operating costs that far exceeded income. The survivors now face the task of mapping out a strategy that will reduce operating costs, increase productivity, and enhance customer service levels.

Many of the quickly adapted e-tailing distribution facilities in operation today consist of an arsenal of misapplied material handling weaponry made useful only by the blood, sweat, and tears of fatigued workers. The information systems supporting their deployment often fail to provide the real-time intelligence required to complete the mission. Field dressings cannot replace a comprehensive strategic plan supported by a lethally efficient, fully integrated combination of material handling infrastructure, information systems, and well-trained employees.

In this struggle to improve the bottom line, computer simulation modeling is an excellent weapon to help you achieve tactical superiority and establish strategic alliances with customers.

A model campaign

Modeling is a powerful analytical technique used to emulate and evaluate operating protocol and associated designs. It is the process of translating the relevant aspects of a system (real or proposed) into a numerical, graphic, or other logical representation that can be tested and manipulated manually or through computer-based applications. Computer simulation is just one of the many modeling alternatives available — others include network analysis, mathematical programming, heuristics, and computer-aided design.

Simulation is the process of developing a mathematical, computer-based dynamic model that will predict the performance of a design or duplicate that of an existing operation. In today’s warehouses, mechanization, automation, and real-time information technology have created a matrix of interdependent activities that is difficult to validate with traditional tools. While a spreadsheet calculation may be sufficient for confirming isolated issues, complex relationships among space, time, personnel, equipment, and information systems call for a powerful, fully relational evaluation tool: the computer model.

Computer modeling is often used to determine whether a system will fulfill design objectives and satisfy projected throughput requirements. It can alleviate system problems and enhance effectiveness.

Testing, testing

Perhaps the most intriguing benefit of employing a computer simulation model is its ability to answer a variety of hypothetical questions. For example, what if …

  • order volume or seasonality change?
  • layout or staffing changes take place?
  • smaller orders are placed more often?
  • the ratios of pallet, case, and piece picking shift from current levels?
  • procedures and WMS logic are altered?
  • orders requiring next-day air shipment increase?
  • batch pick bucket size or the order volume released per wave is adjusted?

Once a base model is set up, you can answer questions like these with authority and confidence. Many users of computer simulation update their models regularly (to match changes in their systems) and download current order volumes/mixes periodically to reevaluate efficiency and make the adjustments needed to achieve optimal performance.
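To make the idea concrete, a what-if harness can be as simple as a base model expressed as a function plus a set of scenario overrides. The sketch below is hypothetical: the pick rates, order profile, and per-order handling allowance are illustrative assumptions, not benchmarks from any real operation.

```python
# Hypothetical base model: estimate daily pick labor for a scenario.
# The rates below (lines per labor-hour, minutes per order) are assumptions.
def pick_hours(scenario):
    case_rate, piece_rate = 60, 120          # pick lines per labor-hour
    per_order_hours = 3 / 60                 # order setup/close-out allowance
    lines = scenario["orders"] * scenario["lines_per_order"]
    case_lines = lines * scenario["case_pick_share"]
    piece_lines = lines - case_lines
    return (case_lines / case_rate + piece_lines / piece_rate
            + scenario["orders"] * per_order_hours)

base = {"orders": 1000, "lines_per_order": 4.0, "case_pick_share": 0.5}

# Each what-if is just an override of the base scenario.
what_ifs = {
    "base": base,
    "volume +25%": {**base, "orders": 1250},
    "smaller, more frequent orders": {**base, "orders": 1600, "lines_per_order": 2.5},
    "shift toward piece picking": {**base, "case_pick_share": 0.3},
}
for name, s in what_ifs.items():
    print(f"{name}: {pick_hours(s):.1f} labor-hours")
```

Because each scenario is a one-line override, adding a new what-if costs almost nothing once the base model exists.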

When considering the use of computer simulation modeling, it is important to understand that distribution operations are not static entities. They are dynamic and evolutionary. Systems retrofits and modifications to procedures and practices occur frequently over time in response to changing business attributes such as consumer demand, purchasing patterns, SKU proliferation, and value-added services. Often, layers of temporary fixes (the proverbial Band-Aids, bubble gum, and duct tape) are applied over one another in an effort to “fight the fire of the day” (and get the product out the door). Because operations managers rarely have the luxury of time and staff to assess and implement long-term solutions, it is not uncommon for systems/procedures to become unrecognizable, illogical, or inefficient.

Computer simulation modeling is a validation tool. It is best used for fine-tuning and optimizing fundamentally sound designs. If you model the decaying operation described above, you will at best produce an accurate depiction of an unproductive system. The “garbage in/garbage out” theory applies well to computer simulation models. Unless you have exercised due diligence in creating a viable operation to model, the simulation will be nothing more than a pretty animation of a poor design. The following four steps are a prerequisite for a successful computer simulation modeling study:

  1. Analysis. The analysis phase of a warehouse/distribution center design project is the foundation upon which all subsequent steps are based. When studying established operations (as in consolidation, expansion, or systems reengineering studies), you would combine historical transaction data with forecast information. Statistics commonly harvested for study include inventory and movement data by SKU, cubes, weights, order profiles, and the like, with practices and procedures documented and analyzed by function. For new ventures with no historical basis, designs must rely on forecasting and benchmarking techniques. The ultimate goal of this endeavor is to statistically define the design year, design day, and design hour requirements for each functional component of the operation.

  2. Concept. While the analysis phase is “science,” the concept phase incorporates “art” as well. Summary data output from the analysis stage aids in formulating high-level design schemes. The essence of this art is the application of only the most viable of hundreds of alternatives to each of the project-specific challenges. After material and information flows are mapped, operating protocol, systems, and equipment configurations begin to take form. High-level appraisal of quantitative and qualitative issues leads to the selection of promising concepts that progress to the design phase.

  3. Design. You would apply classic industrial engineering skills at this stage, aggregating and evaluating the findings of the analysis and concept phases, and refining the candidate concepts to optimize material/information flow, fully utilize cube, balance density and selectivity, minimize travel, provide cube/velocity-correct picking and storage modules, and so forth. The threads of each of the requisite operational traits interweave to form a cohesive fabric. You would then assess the designs rendered in this process and define the following characteristics: space, equipment, staff, capital costs, risk, implementation challenges, and training issues.

  4. Validation. This is the phase in which computer simulation modeling is most effective. Validation involves examining each of the decision points, assumptions, standards, estimates, interfaces, constraints, capacities, and rates, and assessing the accuracy or impact associated with each. Those items likely to affect the operational outcome considerably may be subject to sensitivity testing (quantifying the significance of the high/low swings that may occur as a result of manipulating variables). Once you identify the sensitivity for a particular item, you can test the range of variance against any of the systems that interface with it directly or indirectly.
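As a minimal illustration of the sensitivity testing described in the validation step, the sketch below swings a single pick-rate standard up and down 15 percent and quantifies the staffing impact of the swing. The demand figure and rate are hypothetical.

```python
import math

def required_pickers(lines_per_day, lines_per_picker_hour, shift_hours=8):
    """Pickers needed to clear the day's pick lines (rounded up to whole staff)."""
    capacity = lines_per_picker_hour * shift_hours
    return math.ceil(lines_per_day / capacity)

base_rate = 120      # hypothetical standard: pick lines per picker-hour
demand = 25000       # hypothetical design-day pick lines

# Sensitivity test: swing the standard +/-15% and observe the staffing impact.
for label, rate in [("low (-15%)", base_rate * 0.85),
                    ("base", base_rate),
                    ("high (+15%)", base_rate * 1.15)]:
    print(f"{label:12s} {required_pickers(demand, rate)} pickers")
```

A variable whose high/low swing moves the answer this much is a candidate for deeper testing against the systems that interface with it.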

Money talks

You need not hire a consulting firm to begin using computer simulation models. A number of off-the-shelf products are now available that are well suited for modeling warehousing operations. In fact, most consulting firms use these non-proprietary programs.

Software costs for fully loaded packages often range between $15,000 and $30,000 per user and may require a specialized graphic workstation for optimum performance ($4,000 to $10,000 per user). Fees for yearly licensing, updates, and training agreements may also apply. Typically, an industrial engineer adept at warehouse design would take about six months — with the appropriate training — to become proficient in using a modeling package.

If you intend to maintain, regularly update, and continue to use the model through the design year, it is probably to your economic advantage to handle it in-house. If modeling will not be part of your ongoing activities, a consulting firm will likely be better prepared to execute a simulation project with dispatch and cost efficacy.

Whether you choose to use internal or external resources for your modeling project, the methodology described below is germane.

Define your goals. Paramount to successful modeling projects is the establishment of clear, concise, and practical objectives. Tightly define the project goals and determine what portions of the operation need to be modeled. Some models may include all functions from receiving to shipping, whereas others may be crafted to emulate only the most complex or labor-intensive portions. Develop an understanding of which segments of the operation must be modeled in detail and which can be modeled at the macro level. Most modeling failures are the result of ill-defined goals and too broad a scope.

Collect and analyze the data. Before you start a modeling project, you must be sure that reliable data is readily accessible. Whether this information is historical or based upon forecasts, benchmarks, or standards, you must identify the sources early in the process. Aggregate and develop base data, including process documentation, layout design drawings, equipment performance standards, personnel time standards by function, inventory/movement profiles, order data, product characteristics, and throughput rates. Next, manipulate and analyze this information to prepare for model development. It is advisable to avoid using averages, which can introduce error into a model. Instead, employ mathematical distributions (based upon sufficiently large historical samples), which are much more likely to represent peaking and seasonality variations correctly.

Create the model. Develop the programming logic and animation of the model, preferably with an object-oriented graphic simulation program. Make sure that the model emulates the structure and decision points that the operation requires. It is preferable to import an actual CAD file as the structural backbone of the model, as this will enhance accuracy and provide management with a means to visually identify and understand the process. Schematic diagrams will certainly work; however, they are not as effective in conveying the essence of the operation.

Organizing the process logic into modules is a highly effective technique. In the case of a model that depicts all aspects of an operation from receiving to shipping, demarcate the logic of the various functions so that debugging is easier and the model remains flexible for future reconfiguration. Experienced model builders enter variables rather than finite values into the programming logic, further enhancing the model’s ability to be easily modified. Perhaps the most important task within this step is to document all assumptions and data parameters. Without this information, it will be extremely difficult to forensically reproduce the basis of the model later when your memory of the process has faded.
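A rough sketch of both ideas, with hypothetical labor standards: each function below is one module, and every finite value lives in a single documented parameter set rather than in the logic itself, so retuning the model never means hunting through code.

```python
# Hypothetical labor standards -- variables in one place, not hard-coded in logic.
PARAMS = {
    "receive_min_per_pallet": 2.0,
    "pick_min_per_line": 0.5,
    "pack_min_per_order": 3.0,
}

def receiving(pallets, p=PARAMS):
    return pallets * p["receive_min_per_pallet"]

def picking(lines, p=PARAMS):
    return lines * p["pick_min_per_line"]

def packing(orders, p=PARAMS):
    return orders * p["pack_min_per_order"]

def total_labor_minutes(pallets, lines, orders):
    """Each stage is one module: debug or reconfigure a stage without touching the rest."""
    return receiving(pallets) + picking(lines) + packing(orders)

print(total_labor_minutes(pallets=100, lines=4000, orders=800), "labor-minutes")
```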

Verify or validate the model. Compare the statistical output from the initial model to actual operating data or projections. Investigate deviations and check the model for errors in logic, unrealistic assumptions, and faulty data. Program animations may have the secondary effect of validating the model logic, vehicle paths, and accumulation. Models are often run repeatedly with corrections and refinements to each run in an effort to yield an accurate rendition.
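A first validation pass can be automated with a simple tolerance check that flags deviations worth investigating; the 5 percent tolerance and the metric values below are hypothetical.

```python
def within_tolerance(model_value, actual_value, tol=0.05):
    """True when the model's statistic falls within tol (5% default) of the actual."""
    return abs(model_value - actual_value) <= tol * abs(actual_value)

# Hypothetical comparison of model output against a day of operating data.
checks = {
    "orders shipped":      (1480, 1500),
    "avg pick time (sec)": (46.0, 42.0),
}
for metric, (model, actual) in checks.items():
    verdict = "ok" if within_tolerance(model, actual) else "INVESTIGATE"
    print(f"{metric}: model={model} actual={actual} -> {verdict}")
```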

Perform sensitivity analysis. Run the model and progressively change the variables and/or operating parameters at the end of each run. This tests the upper and lower limits of the system and its components (a virtual stress test). Record and analyze the results of each statistical run, and examine each operating scenario numerous times to ensure that all statistical deviations (based upon the number of variables) have been adequately tested and that the findings are reliable and accurate.
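In simulation terms, the stress test is a sweep: rerun the model, raising one parameter per run while holding the rest constant, and record the statistic of interest. The toy single-server model below (hypothetical rates; exponential arrivals and service times) shows the pattern.

```python
import random

def avg_queue_wait(arrivals_per_hour, service_seconds, hours=8, seed=7):
    """One run of a toy single-server model; returns the mean wait in seconds."""
    rng = random.Random(seed)
    t, free_at, waits = 0.0, 0.0, []
    horizon = hours * 3600
    while t < horizon:
        t += rng.expovariate(arrivals_per_hour / 3600.0)  # next arrival
        start = max(t, free_at)                           # queue if server is busy
        waits.append(start - t)
        free_at = start + rng.expovariate(1.0 / service_seconds)
    return sum(waits) / len(waits)

# Progressively raise the arrival rate while holding the service time constant.
results = {rate: avg_queue_wait(rate, service_seconds=40) for rate in range(40, 90, 10)}
for rate, wait in results.items():
    print(f"{rate}/hr -> avg wait {wait:7.1f} s")
```

Repeating each scenario with several random seeds, rather than a single run, is what makes the recorded deviations statistically trustworthy.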

Develop alternatives. If design flaws become apparent in the preliminary model runs, start thinking about alternatives. The model may show that the proposed design’s throughput capacity is lower than the requirements in the design year. Or perhaps the model points out an operational bottleneck in packing or insufficient accumulation prior to induction at the primary sorter. You can easily develop and test retrofits to the model. It is much more cost-efficient to optimize the “virtual” system through simulation than to identify design deficiencies after implementation. Keystrokes are always preferable to cutting torches.

Select the best alternative. Conduct model runs with each of the alternatives and compare them to determine which one provides the best balance of costs, throughput, and service. Once selected, the best model must go through sensitivity analysis again before proceeding to the implementation stage.

Implement and monitor the results. Now is your opportunity to prove the accuracy of the model, compare it to actual warehouse performance, and modify it to reflect any changes. If you keep the model current, you can use it to fine-tune your operation and employ it as a benchmark against which to evaluate proposed changes to systems, equipment, and procedures.

Lawrence Dean Shemesh is VP/principal of Gross & Associates, a Woodbridge, NJ-based material handling logistics consulting firm. Shemesh specializes in operations design for warehousing, manufacturing, and distribution. He can be reached by e-mail at LShemesh@GrossAssociates.com or by phone at (732) 636-2666, ext. 320.