In the "world economy" and especially in rapidly changing markets, production labor costs are often a relatively small component of the final cost of a product. Cost of design, engineering, financing, marketing and distribution all compete with the actual manufacturing costs. On a large project with a short life-span, like a computer chip or an automobile, the design alone often costs over 70% of the overall cost of building the product. As previous course material has indicated, if the production labor amounts to less than 20% of the total cost of the product, it is hard to justify enormous expenditures to reduce or eliminate this cost (until equally rigorous cost control measures have been applied throughout the organization) unless this investment provides other, more important benefits as well.
For this reason, in a general purpose operation, the most cost-efficient application for computers is still in the integration of manufacturing information, not in the direct or indirect control of manufacturing cells. The best application of CNC cells appears to be in the area of tool manufacturing and tool and fixture development.
The need to "keep up" with our competition is obvious, and when we are not keeping up, the fact is clear to everyone. This manual provides the structure to develop rudimentary tools for analyzing manufacturing costs and scheduling resource allocations. These tools allow you to support basic decisions with information as well as intuition.
On the surface, a CNC mill (at $50,000 financed over 5 years) costs a little less per hour than an employee and may appear to be less troublesome. The problem is that it is much harder to keep a machine working than it is to keep a person working. Just like another employee, the machine requires that the rest of the manufacturing and support system (in this case the information management system) be in place to utilize the power it offers.
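The arithmetic behind this comparison is easy to sketch. In the example below, the interest rate, monthly machine hours and utilization figures are illustrative assumptions, not figures from this manual:

```python
def monthly_payment(principal, annual_rate, years):
    """Standard amortized loan payment."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Assumed figures, for illustration only.
payment = monthly_payment(50_000, 0.09, 5)   # the $50,000 mill over 5 years
machine_hours_per_month = 160                # one shift, if you can keep it busy
machine_cost_per_hour = payment / machine_hours_per_month

print(f"Monthly payment: ${payment:,.2f}")
print(f"Cost per hour at full utilization: ${machine_cost_per_hour:,.2f}")
# The catch: if you only keep the machine busy half the time,
# the effective hourly cost doubles.
print(f"Cost per hour at 50% utilization: ${machine_cost_per_hour * 2:,.2f}")
```

At these assumed figures the machine runs a little under $7 per fully-utilized hour; the point of the paragraph above is that the utilization, not the payment, is the hard part.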
Before we accede to the pressure to computerize our operations, and install CNC tools to replace some of our workers with machinery, we must look at all the areas where our operation fails to measure up, and prioritize our efforts.
The most basic problem you will face in incorporating computers into your own business is that no one is really very good at using them. This statement is not limited to you and your employees. Not even the teachers you will hire, or the group leaders at training seminars, know what you need to know or what you need to do. There is no standardized curriculum in place to teach, and there is little structure in place to certify proficiency. In fact, the entire field of computer training is currently exploding as the enormous size of the market becomes apparent to entrepreneurs of all stripes. The problem you face is not simply learning; it is figuring out what you need to learn. One reason for this is that the whole field is a series of moving targets. Many of the specific computer skills you develop are rendered obsolete when a new version of your application software or of the operating system is released. Sometimes the changes are so sweeping that all that remains of value, after you install the new version, is the general approach to problem solving you have gained in other areas of life and the knowledge that the answer is probably buried in the manual, somewhere.
There is a clear need for objective skill assessment to precede skill training and thus focus the training effort, especially when the training is intended to provide computer skills.
Training program development, needs evaluation, baseline skill assessment testing, curriculum, and certification/proficiency testing for the MTC could be developed for WoodNet under separate contract.
Modern programming languages and strategies are based on the creation and reuse of program "objects". Most programming operations can be seen as event driven -- as "programmed" responses to a range of expected changes in the input. Reusable programming "objects" can be defined and constructed that respond to changes in their inputs with changes in their outputs, and these objects can be linked together to create complex and responsive systems. The justification for the shift to a new language is usually the same one that motivates shifts in other areas: programming and debugging complex systems can be done faster when these tools are used.
However, an even more important use of programming objects has emerged. Object-Oriented "Enabler" Systems have been developed which are intended to "enable" non-programmers to develop or modify the performance of the software. This satisfies two important needs of the manufacturer. First, it allows process control to be very responsive to changing conditions, and second, it reduces cost by allowing multi-functional employees to make adjustments in the control software.
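As a rough sketch of the object idea (the language and the thickness rules here are invented for illustration; no particular enabler product works exactly this way), two linked objects respond to an input event by propagating changes downstream:

```python
class Node:
    """A programming 'object' that recomputes its output when an input changes,
    and pushes the result to any downstream nodes linked to it."""
    def __init__(self, rule, downstream=None):
        self.rule = rule                  # how inputs map to the output
        self.downstream = downstream or []
        self.output = None

    def on_input(self, value):
        self.output = self.rule(value)    # programmed response to the event
        for node in self.downstream:      # propagate the change
            node.on_input(self.output)

# Two cells linked into a tiny responsive system (rules are invented examples).
sander  = Node(rule=lambda thickness: round(thickness - 0.02, 3))
checker = Node(rule=lambda thickness: "pass" if thickness >= 0.70 else "reject")
sander.downstream.append(checker)

sander.on_input(0.75)   # a board arrives; the event ripples through the chain
print(sander.output, checker.output)
```

A multi-functional employee adjusting the system would change a rule (the 0.70 threshold, say) rather than rewrite the program.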
In a large, continuous process operation, such as a chemical plant (think of a brewery or paper-mill) or a steel mill, minute adjustments are made at many points along the process in response to changing internal and external conditions and to observed conditions at various points along each production line. Relationships can be established between conditions at various points and changes observed at any of these points can trigger the initiation of changes at other points.
It was once believed that these relationships could be automated. Such automation of processes implies the removal of human intelligence from the process and the substitution of a system of hard-coded rules. The presumption was that replacing human intelligence with a system of rules would reduce errors and raise quality. However, this is rarely the case. Humans make up the rules, and in systems of any significant complexity the relationships are difficult even to map, let alone to define every way the factors are allowed to interact.
To make matters worse, in rule-based automation, software engineers with little real manufacturing experience ended up coding the rules, because the awkward languages in which the lists and databases were developed were too cryptic for anyone else.
The development of Expert Systems addressed this problem by demanding that a higher level of committee review be applied to the rules. The software expert programs the rule, but the rule content and operation is validated by an independent panel of experts. Unfortunately, this methodology has driven the cost of implementing expert systems and knowledge-based engineering out of the reach of the smaller companies that need it most.
As computers have become more powerful, process control languages have become less cryptic and more accessible. Not long ago, process control software was run from punch cards, and then from magnetic tapes. Programs were written, debugged and run by programmers. There was no place in the operation to make changes on the fly. Nor was there any place to make "continuous improvements". The process was broken down into discrete steps and each was "hard-coded". The cost of such coding is very high. A good programmer might average as few as 15 to 20 lines of "debugged" working code per day. This "glacial" pace is hard to match to the real world. Over a period of six months, in response to changing conditions, a program might need to be revised many times. At 20 lines a day, it simply never gets finished. This is how programming at major aerospace companies often works: you spend 3 or 4 years of your life on a project, and it never gets finished. It just gets canceled.
During the late 1970s and 1980s a new software paradigm emerged: interactive computing. This led to a major change in the way software was written, made possible by increases in microprocessor performance and fueled by reductions in the cost of memory. These factors allowed the development of enormous computer programs with ever-increasing "accessibility". The underlying assumption is that it is better in the long run to decentralize the control of information in the organization.
Today's enabler software can be run by manufacturing engineers themselves. The intent of enabler technology is to allow manufacturing engineers, rather than software engineers, to "program" the computers which control the manufacturing process. Their knowledge of the underlying process can then be applied directly to the operation, without programmers having to translate that understanding into "C" or some other computer language.
Today, tools can be built using the new interactive process control and simulation packages which will allow the return of substantial amounts of operational control to the point of manufacture on the shop floor. This view is borne out in large companies around the world, where more and more control of the overall process is being returned to the operators, and the focus of automation is moving from production to management. This is a major departure from previous approaches to the application of computers in manufacture, and it will change your world. The cost of simulation technology is falling rapidly, and the ability to develop a model to explore the impact of subtle changes in the operation of a small factory is now within reach.
One of the most important capabilities of simulation software is the ability to create a Virtual Factory. The Virtual Factory provides the ability to actually run simulated processes and to examine their output at a level of precision that is equal to (if not higher than) what we have available in the real world. The advantage of modeling and running a process in simulation rather than in the real world is that you can identify and rectify problems before they arise, rather than having to confront them after resources have been committed to the development of hard tooling and retraining your work-force.
In a production situation, the sort of process change you may wish to model might involve the impact of raising Q/A standards. Returned goods are a catastrophe in any operation, but the cost of eliminating them has to be kept under control. Suppose statistical analysis of the returned goods has identified the primary sources of defective merchandise. Employee training is initiated, the operation of these work centers is slowed, and inspection stations are added down-stream from the work centers where the errors occur. What are the probable impacts of this change on overall throughput, and at other stations on that line?
If the rejected items are fed back into the line for rework, the workload increases substantially at the point of reintroduction. This can lead to a variety of problems, including an increase in defective parts. Exactly what will happen cannot be predicted, but probabilities can be assigned, based on past experience.
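A small Monte Carlo sketch shows how the workload grows at the reintroduction point. The defect and rework rates below are invented for illustration; in practice you would assign probabilities from your own return statistics:

```python
import random

def run_line(cycles, base_defect_rate=0.05, rework_defect_rate=0.12, seed=1):
    """Feed rejects back into the line and count total passes through the
    reintroduction station. All rates are illustrative assumptions."""
    random.seed(seed)
    station_load = 0
    shipped = scrapped = 0
    for _ in range(cycles):
        rate = base_defect_rate
        attempts = 0
        while True:
            station_load += 1              # every pass loads the station
            attempts += 1
            if random.random() >= rate:
                shipped += 1
                break
            if attempts >= 3:              # give up after two reworks
                scrapped += 1
                break
            rate = rework_defect_rate      # reworked parts fail more often

    return station_load, shipped, scrapped

load, shipped, scrapped = run_line(10_000)
print(f"Station handled {load} passes for 10,000 parts "
      f"({shipped} shipped, {scrapped} scrapped)")
```

Even at these modest rates the station handles several hundred extra passes per ten thousand parts, which is exactly the kind of number that is hard to guess and easy to simulate.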
In the most basic sense, a simulation is a very logical extension of the PDM model presented previously. You begin with a blank page on which you must place every step in the manufacturing process, no matter how small it appears. Then you "thread" all these operations together into a chain of events, and assign times to each operation and to the transit times between operations. Some of the transit times are artifacts of the arrangement of your production operation, others serve essential "buffering" functions, like the film-loop on either side of the shutter on a movie projector. In either event, all the steps must be charted. From this, a flow-chart is developed, and the various operations are classified according to their functions, based on inputs and outputs. For example, an operation with two possible outputs constitutes a "branch" and can provide a logical test, depending on what arrives at its input.
Mathematical analysis of the interactions of the wide range and large number of possibilities that exist will often defy seat-of-the-pants solutions, but a simulation tool can be created to apply logical tests, and when conditions that are likely to cause defects arise, log a defect. The simulation process can then be run through thousands (or millions) of cycles, and patterns of defect-causing conditions can be identified. The simulation can then be modified in various ways in the course of attempting to eliminate the defects.
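A minimal version of that defect-logging loop might look like the following; the conditions, probabilities and rules are placeholders for the relationships you would chart from your own process:

```python
import random
from collections import Counter

def simulate(cycles, seed=2):
    """Run the process many times, apply logical tests, log a defect when
    defect-causing conditions arise, and tally the condition patterns."""
    random.seed(seed)
    log = Counter()
    for _ in range(cycles):
        feed_high = random.random() < 0.30   # feed rate above spec (assumed)
        tool_dull = random.random() < 0.10   # cutter past edge life (assumed)
        # Assumed rule: the combination causes defects far more often
        # than either condition alone.
        p = 0.40 if (feed_high and tool_dull) else 0.05 if (feed_high or tool_dull) else 0.0
        if random.random() < p:
            log[(feed_high, tool_dull)] += 1
    return log

patterns = simulate(100_000)
for (feed_high, tool_dull), count in patterns.most_common():
    print(f"feed_high={feed_high}, tool_dull={tool_dull}: {count} defects")
```

Over a hundred thousand cycles the defect-causing combinations surface in the tally, and the same loop can be re-run after each proposed modification.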
Attempting to revise production to eliminate defects without simulation software is very expensive. In the class, we looked at a simulated warehouse and weighed the cost of moving seasonal merchandise (tire chains and anti-freeze) from the active stock into "dead storage" in the spring and back into the active stock in the fall (moving it, and the equally seasonal merchandise it displaces, twice a year) against the increased cost, primarily increased travel time, of working around it eight months out of the year.
Modern simulation software allows graphical representation and animation of the manufacturing processes. The steps you have threaded into a process and flow-charted are then animated, and the animation can be compared to a videotape of the actual process in your plant. Once the simulation and the operation are functioning identically, the animation engine can be turned off, and the simulation can be speeded up. The end result of a simulation process is a detailed analysis of the operation of the production process, including identification of resource constraints, error propagation, and overall throughput limitations.
Modern simulation tools provide fast, graphical, non-mathematical support for this kind of decision making. Simulation of production processes is an essential function of an MTC. Access to simulation software clearly requires the development of detailed production information (steps taken, time consumed, sequence, precedence/dependence relationships between steps and processes) discussed in previous sections.
As discussed earlier, the gap between where you are today and where you have to get to in order to utilize information management systems, simulation tools, CAD software and CNC machinery profitably is large.
Where can you find the time to build the bridge? You cannot afford to buy tools that you know you don't have time to learn how to use, and you cannot afford to take the time to learn to use them. It's a Catch-22. You need to know how to do something, but you don't have time to read the manual. As stated repeatedly above, most people end up embracing the new technology in a way that actually endangers their bottom-line.
People buy electronic filing systems and abandon their paper systems. This is a dangerous mistake. In the real world you need to maintain redundant systems, and paper management methods will never really be obsolete. Even when you have everything organized in the machine, you will still want a wall chart to tell you where to start looking. You can print new pages for your Rolodex from your Personal Information Manager (PIM), but it will be years before your computer can find a phone number faster than you can turn the knob on the Rolodex.
The first step in getting ready to use CAD and CNC in your manufacturing operation is the translation of existing shop drawings (or the generation of new ones) into CAD files. You cannot really proceed with CNC until this has been done. Creation of these drawings is probably going to be an expensive and time-consuming process. Many products go straight from conceptual sketches into production without ever turning into drawings. Likewise, design changes are made on the fly and the drawings are never revised to reflect the changes. Even extracting dimensions from paper or mylar drawings is problematic. It is definitely a process you will want to get help on, and it may include some steps that you want to hire out. There are many highly-trained CAD operators who can be of great use to you, both as "gurus" and as "proof-readers". It is absolutely essential that you produce drawings that the rest of the world can use. Unfortunately, this is not as easy as it ought to be. The first problem is precision. The measurements you extract from your drawing must be used to locate the control points that define the basic geometry of the items you are trying to describe. This is not necessarily easy or obvious.
As you move into the larger world of CAD (the universe outside your own computer), documentation is crucial. Although "export" formats such as DXF allow files to be moved from package to package and version to version, CAD files are fairly "sterile" and difficult to understand. They contain the geometry (dimensions, angles) of the design, but they are necessarily reductions of the information that supports that geometry. Many aspects of your drawings will require substantial clarification before someone looking at them can figure out what they represent or how best to program a CNC router to cut them out.
CAD programs use a method of representing lines that is somewhat foreign to normal understanding. Lines are represented as "vectors": imaginary lines connecting defined points. Unlike the "lines" we learned about in plane geometry, there is no infinite number of intermediate points along these lines. As a result, a vector line can be represented very efficiently: starting from a point, the system records a distance and a bearing to the next point.
Each segment between points is a separate element. In many cases, this means you will have two or more coincident endpoints at every vertex. This has many important implications when you begin to turn these files into CNC instructions. It is absolutely essential that the points defining the geometry "snap" to their exact locations, without gaps. Maintaining this standard is not easy: you cannot tell, at screen resolution, whether your line is continuous.
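Both points can be checked programmatically. This hypothetical sketch stores each segment as a pair of endpoints, derives the distance-and-bearing form, and flags joints that fail to snap within a tolerance; the profile and tolerance values are invented examples:

```python
import math

def segment(p1, p2):
    """The vector form of a line: distance and bearing from one point to the next."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

def gaps(segments, tol=1e-6):
    """Flag consecutive segments whose shared endpoint misses by more than tol."""
    bad = []
    for (_, b), (c, _) in zip(segments, segments[1:]):
        if math.dist(b, c) > tol:
            bad.append((b, c))
    return bad

# Three segments meant to form one continuous profile; the first joint
# misses by 0.001 -- invisible at screen resolution, fatal to a CNC path.
profile = [((0, 0), (10, 0)), ((10, 0.001), (10, 5)), ((10, 5), (0, 5))]
print(segment((0, 0), (10, 0)))   # (10.0, 0.0): length 10, bearing 0 degrees
print(gaps(profile, tol=1e-4))    # the joint that failed to snap
```

A check like this is the programmatic equivalent of zooming in on every joint, which is exactly what you cannot do by eye.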
Normally, the first step in providing clarity is annotation. Parts are labeled using text. This is often where the problem starts. You have used fonts that are not on the other person's machine, so they cannot readily connect your explanation with the details you are trying to explain. Still, generating a text file, as you generate a drawing, is an essential activity that no one really does, but that everyone knows they ought to do. A lab-style notebook next to your machine could be worth its weight in gold, if you use it to keep track of what you did and why. Software is available to track revisions, but the cost is very high (more than the cost of a good CAD package).
CAD systems generally provide "rendering" tools which allow "hidden line removal". A 3-D representation of a cabinet in CAD, in an isometric or orthogonal view, would look a bit like a cabinet assembled from Plexiglas, except that some areas would contain an overwhelming tangle of overlapping lines. Hidden-line-removal calculation normally works from back to front, determining which "planes" lie in front of (and thus obscure) other "planes" in the drawing, and removing the lines which ought to be invisible. This is an essential function of "rendering", but it is irrelevant to the geometric description needed by a CNC tool. What the tool cares about is the closure of all lines. Lines in CAD that cannot be defined as simple vectors can be defined as arcs, splines or polylines. Arcs and splines are mathematically described; polylines are assemblies of lines, angles and arcs.
In complex drawings or large projects, the only reliable means to achieve maintainability in your drawings is through the use of a data dictionary. This document, normally maintained in a database, contains the history of the file, and the basis of the dimensions and other information in the drawing. It ought to lead to the name and physical location of all source material, the name and phone number of the person who input the data, the dates of all revisions, and what was revised, as well as technical information such as the layer assignments and fonts used.
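One way to hold such records is a structured entry per drawing, kept in the database. The field names below are an illustrative sketch of the items listed above, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class DrawingRecord:
    """One data-dictionary entry for a CAD file (illustrative fields only)."""
    filename: str
    source_material: str          # name and physical location of all sources
    entered_by: str
    phone: str
    revisions: list = field(default_factory=list)   # (date, what was revised)
    layers: dict = field(default_factory=dict)      # layer name -> purpose
    fonts: list = field(default_factory=list)

rec = DrawingRecord(
    filename="cab-104.dwg",
    source_material="Mylar master, flat file drawer 3",
    entered_by="J. Smith", phone="555-0100",
    layers={"DIM": "dimensions", "CUT": "tool paths"},
    fonts=["ROMANS"],
)
rec.revisions.append(("1995-05-12", "door reveal changed to 3mm"))
print(rec.filename, len(rec.revisions))
```

The names and values are invented; the point is that every item the paragraph lists has an explicit, queryable home.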
Imagine a CAD file representing a drawing for a structure made from wooden parts, held together by interlocking corner details. To avoid having to redraw the corner details again and again, the CAD programmer develops a library of standard details from which drawings and parts descriptions are constructed, including corners.
These details can be called up and connected into larger objects. They can also be stored with connections to databases of relevant information. These databases include information about the material, such as appropriate orientation and specifications for selection of suitable materials (and rejection of others) based on wood and grain characteristics.
This approach recognizes, but does not completely solve the problems we face in applying CAD and CNC to the manufacture of products made from wood.
Continuous process operations provide a useful initial model for optimal production efficiency, but an inadequate model for manufacturing, because they do not have to respond to the full range of external conditions one encounters in custom-building or batch processes. Inputs to the continuous model are few: the raw materials, the desired output characteristics (if variable) and desired output rate (if variable). That's it. Chemicals in, ice-cream out.
In the manufacturing arena there are thousands of variables, all the way down to the unique mechanical characteristics of individual pieces of wood, which demand that decisions be made on the fly. Addressing all these variables in a computer program is a major problem. In fact, developing databases of material characteristics is one of the key tasks that must be accomplished before many of the wood-based machining operations can be successfully computerized.
The amount of material removed in any machining operation is a function of the cutting path and the characteristics of the material (springback, chatter, and other complex interactions within the machining process). These characteristics are all "predictable" and therefore "programmable". "Predictable" means that a skilled machine operator knows how to address them. "Programmable" means that this information can be assembled in a materials database which feeds a simulation model, or modifies the control language and redirects the router's path to compensate for the characteristics of the material. Until these characteristics have been catalogued and defined, and their characteristics made accessible to the control program, non-homogeneous (anisotropic) materials will present significant programming and router-control problems.
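A minimal sketch of the idea follows: a materials database consulted by the control program to compensate the cutting pass. Every species entry and compensation rule here is an invented placeholder; real values would come from the cataloguing effort described above:

```python
# Invented materials database; real entries would come from cataloguing
# and defining the characteristics of each species and grain condition.
MATERIALS = {
    "red_oak": {"springback": 0.08, "chatter_risk": "high"},
    "poplar":  {"springback": 0.03, "chatter_risk": "low"},
}

def adjust_pass(nominal_depth, base_feed, species):
    """Compensate depth of cut for springback and slow the feed when
    chatter is likely, as a skilled operator would."""
    m = MATERIALS[species]
    depth = nominal_depth * (1 + m["springback"])        # cut slightly deeper
    feed = base_feed * (0.7 if m["chatter_risk"] == "high" else 1.0)
    return round(depth, 3), feed

print(adjust_pass(6.0, 2000, "red_oak"))   # depth in mm, feed in mm/min
```

The point is the shape of the solution: the operator's knowledge ("oak springs back, slow the feed") lives in data the control program can consult, not in the program itself.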
Right now, there is a big step between design and manufacture, called proofing. This is common in other industries that must address complex materials and multiple variables, especially in graphics. Computers revolutionized publishing during the 1980's. By the end of the decade people ran entire magazine operations using desktop computers, including the page layout and the color separations used for plate making.
Files are transmitted by modem (or carried by courier) to a graphics output service where they are used to drive very expensive high resolution photographic printers (linotronic plate makers) which produce continuous sheets of plastic negative material, in black and white, at a dot density 16 times as high as a laser printer. These are enormous files, and the cost of running them is high. A color brochure will often use 5 plates (C,Y,M,K,G). Plates are made from these negatives by a direct photoreversal process, and then each is inked and applied to the paper in sequence. Even though it is possible to do color separation on a PC and go straight to the press, you and your customer are often surprised by the results when it comes off the printing press.
For this reason, products are normally "proofed" (prepress) and the values revised in an iterative process before the final plates are developed.
The same applies to CAD files sent to an output service. Unless you are working in flakeboard or high density plywood, the chances that your file will produce the parts you want on the first try are not too good. Generally, the tools you are using to create your CAD files (Generic CAD, DesignCAD, CADKey, AutoCAD, MasterCAM, SmartCAM, etc.) were designed to create models in paper-space and if they have links to any physical-properties databases, they are to databases related to metal or plastic, not wood.
This presents a number of significant problems. To make matters worse, the tool-control software may interpret drawing information that is merely an artifact of your drawing methods as an instruction, and respond in ways you do not intend.
The link between CAD drawing and CNC cutting is vastly more complex in wood than it is in metal or plastic. Wood is a non-uniform and highly directional material, and the fibers in the wood interact with the path, feed-rate and speed of the cutter in ways that are both difficult to predict and potentially intolerable. Characterization of these interactions requires mathematical analysis beyond the capabilities of any software now available and is far beyond the skills of the CAD operator.
Rather than attempt to calculate all the possible cutting options (speed, direction, orientation, etc.), the CNC operator will make an educated guess, based on past experience, and eliminate the "merely possible" from the "probably appropriate" solutions. Then the operator will close on the solution from among the remaining options through real testing.
The results of these tests are critical and hard-won information that must find its way into the company's long-term memory. Ideally this information would automatically become part of the CAD file and provide direction to future operators. Current software does not make this easy, although ADE, the latest database add-in to AutoCAD, may make it reasonable to expect this from CNC add-ins in the near future.
For many years the difficulty of available programming languages and the perpetuation of the "cult of information" have split workers from programmers and managers, and put many design and production decisions into the hands of people who lack the information needed to make them.
It is essential that the enabler tools of the future allow direct input that utilizes the experience and wisdom of the operators and craftsmen. It is beyond the capabilities of today's hardware and software to "know" that the fit and finish of a machined part, or the strength and overall performance of a machined joint, will be sensitive to subtle variations in the orientation of the wood's grain.
As demonstrated throughout this section, the cost of migrating your products to CNC is going to be high. The most difficult decision you will have to make is whether to use CNC as the means of production (to make finished parts or merchandise) or as the means to create production tools and fixtures that can be used with other manufacturing technologies. There is no simple way to automate this decision. Part of the reason the MTC is based on CNC machinery is that it allows this decision to be approached gradually and made on the basis of experience in your own business, rather than on the promises of salesmen. Many companies will buy CNC equipment in the next few years. Many of them will not survive the purchase. Part of your job as a business manager is to position yourself to be an ally to the people who do buy into the hardware, by being ready to buy time on their equipment.
In the "world economy", and especially in rapidly changing markets, production labor costs are often a relatively small component of the final cost of a product. Recapture of the cost of design, engineering, financing, marketing and distribution all compete with the actual manufacturing costs for the sales revenues. The following shows you the development cost of several very diverse products:
| | Rollerblade Bravoblade | Hewlett-Packard Deskjet 500 | Boeing 777 Airplane |
|---|---|---|---|
| Annual production | 100,000 pairs | 1.5 million units | 50 units |
| Total sales | 300,000 pairs | 4.5 million units | 1,500 units |
| Sales lifetime | 3 years | 3 years | 30 years |
| Sale price* | $200* / $89 | $365* / $168 | $130,000,000 |
| Margin after development | $22,500,000 | $25,000,000 | $3,000,000,000 |
Adapted from Ulrich & Eppinger, Product Design & Development, 1995; data derived from publicly available information and company sources.
These products represent an interesting spectrum. The payoff analyses of two of the three products are conditioned by factors that do not appear on these sheets. The economics of Boeing's commercial aircraft division are probably too complex (and too political) to delve into here, but the development strategy behind the printer is worth exploring.
In the design of the DeskJet printer, two important factors were identified: 1) the enormous volume of high-value consumables, and 2) the re-applicability of the basic technology. High cost consumables are the basis of the medical appliance and apparatus industry. In the early days of InkJet technology, there were two strategies: 1) ink conveyed from a reservoir to the printhead (Canon's model), and 2) the integral reservoir/printhead (HP's model). You will note that in today's market, all inkjet manufacturers use integral reservoir print-heads. This is because HP, Canon and others sell disposable ink cartridges for these printers, which contain the printhead as well as the ink reservoir. These cartridges retail for approximately $22.00 and are sized so they must be replaced regularly. During the life of the DeskJet, the cost of consumables will exceed the initial cost of the printer. And because the production and distribution costs on the consumables are low, and profit is far higher than the profit on the printer, the DeskJet (or Canon's BubbleJet) printer is thus a "Trojan horse" for a platform product: namely, ink-jet cartridges.
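The consumables arithmetic behind the "Trojan horse" claim is easy to check. The cartridge price is the figure quoted above and the printer price comes from the table; the pages-per-cartridge and printer-life numbers are assumptions for illustration:

```python
# Rough lifetime-consumables arithmetic. Cartridge and printer prices
# are from the text; the usage figures are illustrative assumptions.
printer_price   = 365          # DeskJet 500 list price
cartridge_price = 22.00        # retail cartridge price quoted above
pages_per_cartridge = 500      # assumed
lifetime_pages      = 20_000   # assumed printer life

cartridges = lifetime_pages / pages_per_cartridge
consumable_spend = cartridges * cartridge_price
print(f"{cartridges:.0f} cartridges, ${consumable_spend:,.2f} in consumables "
      f"vs a ${printer_price} printer")
```

Under these assumptions the owner spends well over twice the printer's price on cartridges, which is the whole point of the integral reservoir/printhead strategy.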
Following this logic, HP has produced a wide range of products based on its initial investment in inkjet technology.
The ultimate integration of CAD & CAM technology is taking place in the emerging field of flexible manufacturing systems (FMS).
FMS is a building-block system composed of programmable robotic manufacturing cells, each capable of a fairly narrow range of activities. By arranging these cells in different ways and arming each with the appropriate cutters or assembly fixtures, a manufacturer can accommodate a wide variety of functions and automate a wide range of activities, depending on the sequence and operations programmed.
You need to understand that all of the tools discussed in this section exist now and are in use locally already. The Boeing 777 was entirely designed in a virtual factory, based on a simulation of the FMS-based factory that was eventually created to produce the planes. Such a system is readily available and priced at under $100,000 per installation (May 1995 est. $70,000, not including all the post-processors). This is not Star Wars. This is now, and these tools will have real applicability to your operation in the near future, when the price of time in the driver's seat drops within reach. The simulation software demonstrated in this course's classroom sessions costs under $17,000 and runs on mid-priced PCs (fast 486 DX processor with 12 to 16 MB RAM).
There is an enormous learning gulf that must be bridged before you will be ready to use these tools, but your competition, if you look beyond your traditional local rivals to the larger world market, is already working to bridge it.
Before you can use them effectively, you need to develop the information and the information management skills discussed elsewhere in this manual, and then develop a "manufacturing community" based on cooperative relationships with other manufacturers in your community. The tools described in this section are too expensive to sit idle, waiting to be used. They need to be fed a constant stream of products. This requires sales and generating sales requires products fine-tuned to the needs of the market.
In some operations the application of flexible manufacturing cells all but eliminates the need for specialized tools and tooling and dramatically reduces the cost of producing revisions. In most cases it can also reduce the cost of getting a product into production in the first place. In many ways it represents a major shift from the long-held paradigm of increasing operational efficiency by making manufacturing operations work like continuous production plants (paper mills, canneries), where specialized single-application machinery is arranged to provide a continuous flow of material converging at the point of packaging. Flexible manufacturing is in many ways a return to the model you began with when you opened a craft-based business built around general-purpose tools like drills, table saws and router tables.
The FMS concept dovetails with the development of the Virtual Factory in such a "chicken and egg" manner that it is difficult to tell which side of the process is actually driving the other.
The virtual factory is composed of a backbone of powerful "high end" CAD software which goes far beyond conventional drafting tools. The ability to link entities into "assemblies" allows creation of fully functional 3-D representations of virtual objects (e.g., shafts that rotate, "attached" to crank arms that link the linear reciprocating motion of the piston to the rotational motion of the crankshaft; gears that transfer rotation from one element to another; cams that actuate valves; springs that compress and relax; etc.). You get the picture: instant animations.
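The crank-and-piston linkage just described is governed by the standard crank-slider equation. As a rough illustration (the function and parameter names are invented here, not taken from any CAD package), this is the kind of computation a kinematics engine repeats for every frame of an animation:

```python
import math

def piston_position(theta, crank_radius, rod_length):
    """Piston displacement from the crank center for crank angle theta (radians)."""
    r, l = crank_radius, rod_length
    # Crank-slider equation: x = r*cos(theta) + sqrt(l^2 - (r*sin(theta))^2)
    return r * math.cos(theta) + math.sqrt(l ** 2 - (r * math.sin(theta)) ** 2)

# Sample one revolution in quarter turns, the way the CAD system
# recomputes every linked entity frame by frame.
positions = [piston_position(n * math.pi / 2, 40.0, 120.0) for n in range(5)]
```

At top dead center (theta = 0) the piston sits at crank radius plus rod length from the crank center; the animation is just this evaluation repeated for every linked part.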
So what? Why is this important?
This is a fundamental change from previous visualization tools and most animation tools. Most of the surface-modeling or "rendering" ability of CAD systems (in particular, ray-tracing to model the effects of multiple light sources on the object being rendered) was developed to allow "photorealistic rendering". This provided a very important selling tool: 3-D renderings of objects that looked "real" and could be rotated and viewed from multiple angles. It allowed designers to "sell" complex ideas to managers and directors who couldn't necessarily read mechanical drawings, using a medium familiar to all: a monitor that looked like a television. But as valuable as this function is, rendering tools are slow and expensive to run and don't give the designer (whose professional career is built on the ability to visualize and render) much power. For anyone with traditional representational skills, these tools are a major obstacle to communicating ideas quickly.
The next-generation tools serve another purpose entirely. They were not intended to create quick sketches or rough ideas, or to do anything "quickly". They provide a visual metaphor for a centralized project information system and activity clearinghouse. This allows design teams working independently and/or in different workgroups (or even different companies), using a variety of powerful but "incompatible" CAD packages, to merge their data files.
These files can hold and integrate the coordinate geometry of parts and subassemblies composed of thousands (or even tens of thousands) of discrete parts. When merged in a system like Silma, they become a working representation of the finished product for review of the fit and interplay between subassemblies. Not only can the completed product be rotated and observed from different angles; parts and subassemblies can also be moved relative to one another and linked together, allowing unexpected collisions to be detected. This lets the team identify assembly or service problems that might otherwise have reached production before they were discovered.
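At its crudest, collision detection of this kind can be approximated with axis-aligned bounding boxes. The sketch below (part names and dimensions are invented for illustration) flags every pair of parts whose bounds overlap:

```python
def boxes_collide(a, b):
    """a, b: ((xmin, ymin, zmin), (xmax, ymax, zmax)) axis-aligned bounds."""
    (amin, amax), (bmin, bmax) = a, b
    # Boxes overlap only if they overlap on all three axes.
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

def check_assembly(parts):
    """Return every pair of part names whose bounding boxes overlap."""
    names = sorted(parts)
    return [(p, q) for i, p in enumerate(names)
                   for q in names[i + 1:]
                   if boxes_collide(parts[p], parts[q])]

parts = {
    "bracket": ((0, 0, 0), (10, 10, 5)),
    "shaft":   ((8, 4, 1), (30, 6, 4)),    # intrudes into the bracket
    "cover":   ((40, 0, 0), (50, 10, 5)),  # clear of both
}
collisions = check_assembly(parts)
```

A real system refines coarse hits like these with exact surface-to-surface checks; the bounding-box pass simply narrows the candidate pairs.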
Just as important as the underlying graphics and data management engines powering these systems are the file translation tools which allow the incompatible methods of geometric description used in the popular high-end CAD systems (Computervision, CATIA, etc.) to coexist. The output of the simulation package is an integrated representation of the product, which can drive the next stage of visualization: virtual production.
FMS post-processors
Just as there is no real standard for characterizing lines, points or layers in CAD systems, there is no standard control language for FMS cells; emerging technology continuously outgrows the languages on which it is based. However, since the manufacturers of FMS cells must assume that their cells will be controlled by a wide range of existing computer systems and software, they must, if they are to sell them, provide the specifications of the interface and its commands in a form that allows software engineers to connect CAD systems to them. These cell command languages provide the basis for creation of the Virtual Factory.
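To make the idea of a post-processor concrete, here is a toy translator that renders one neutral toolpath in two invented cell dialects (the command formats are illustrative only; real dialects come from the cell manufacturer's interface specification):

```python
# Two hypothetical cell command dialects, keyed by toolpath operation.
DIALECTS = {
    "cell_a": {"move": "MOV {x:.2f} {y:.2f}", "cut": "CUT {x:.2f} {y:.2f}"},
    "cell_b": {"move": "G00 X{x:.2f} Y{y:.2f}", "cut": "G01 X{x:.2f} Y{y:.2f}"},
}

def post_process(toolpath, dialect):
    """Translate a neutral (op, x, y) toolpath into one cell's command language."""
    templates = DIALECTS[dialect]
    return [templates[op].format(x=x, y=y) for op, x, y in toolpath]

path = [("move", 0, 0), ("cut", 25, 0), ("cut", 25, 12.5)]
commands = post_process(path, "cell_b")
# ["G00 X0.00 Y0.00", "G01 X25.00 Y0.00", "G01 X25.00 Y12.50"]
```

The CAD side keeps one neutral description of the work; a post-processor per cell handles the translation, which is exactly why a library of them is part of the package price quoted earlier.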
The CAD model developed from all the CAD part and subassembly files is disassembled and a manufacturing strategy for each part is developed. This strategy is based on the application of the particular capabilities of the available FMS cells arranged in the appropriate sequence.
The cell manufacturer provides more than the cell's command language. The cell's physical geometry and range of motion are also specified. This allows the manufacturing engineer to design a way to reproducibly and accurately locate the raw material, part or subassembly in the cell, and to program the process or processes that will take place there. The cell, the subassembly and the activities taking place there can all be "rendered" and animated by the CAD system.
If the cell removes material, the geometric characterization of the part will reflect the removal after the operation is performed. In this way, the entire manufacturing process can be modeled inside the computer, and the parts created through the application of the virtual manufacturing cells can be measured and fitted together into subassemblies, with tolerances then tested. The technology has already developed to the point where the manufacturers of FMS equipment recognize that they must supply CAD files allowing their products to be integrated into the virtual factory, or be left behind. This has led many of the important manufacturers of robotic tooling to develop "virtual" versions of their tools, so the latest measuring instruments are available as well. This allows virtual parts to be "placed" in the measuring cells located on the virtual production line and the dimensions and tolerances of virtual parts to be precisely measured.
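The virtual inspection step amounts to comparing each measured dimension against a nominal value and a tolerance. A minimal sketch, with invented dimension names and values:

```python
def within_tolerance(measured, nominal, tolerance):
    """True if a measured dimension falls inside nominal +/- tolerance."""
    return abs(measured - nominal) <= tolerance

def inspect(part, spec):
    """Compare a virtual part's measured dimensions against its spec.
    spec maps dimension name -> (nominal, tolerance); returns failing names."""
    return [dim for dim, value in part.items()
            if not within_tolerance(value, *spec[dim])]

spec = {"bore": (12.000, 0.025), "length": (80.000, 0.100)}
part = {"bore": 12.031, "length": 79.950}
failures = inspect(part, spec)  # the bore is out of tolerance
```

The point is that this check runs against virtual measurements before any material is cut, so a tolerance stack-up shows itself in the model rather than on the shop floor.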
How far away is this technology?
It is literally here now. If you were ready for it today, you would be able to use it this week. A major portion of this course was devoted to exposing you to the kinds of skills you will need to develop to get ready to access these tools. The next generation of the MTC concept, which will arrive in the very near future, will be able to offer both access to the virtual factory and the link to direct manufacturing capability that FMS offers.
FMS operates at two distinct levels: 1) at the prototype stage, and 2) at the production stage.
At the prototype stage
At the prototype stage, FMS tools, and especially fast-prototyping machines (see SL and SLS immediately below), allow one-off (one-at-a-time) fabrication of plastic or metal parts for limited production. This technology was discussed briefly in the previous section. Functional products can be made using one-off plastic and metal parts, allowing end-user evaluation and even field-testing of alternate implementations which closely resemble finished products early in the prototype stage of product development. Parts virtually interchangeable with production parts can be developed without first creating the metal molds that would otherwise be needed to injection-mold the finished parts.
There are a variety of tools available which allow this kind of work, and in spite of the gulf between what they do and what familiar products do, they are not far removed from the laser and water-jet cutting tools your competitors are already using to cut and engrave wooden parts for cabinets and desk accessories.
The original direct fabrication technology was known as Stereo-Lithography (SL). In the SL system a laser beam, controlled by a computer driven by a 3-D CAD file, draws a line through a thermosetting resin solution, providing the energy to initiate polymerization and form a solid plastic object. The laser advances along the product's sections slowly, and as it moves it deposits a thin layer of polymerized plastic in its wake. In a manner very similar to the way a potter builds a pot or bowl from coils of clay, the laser builds up the edge of the object to create a three-dimensional solid structure inside a vat of plastic resin. The system is limited by the range of suitable polymers and the size of the vat in which the part is "grown", but it is capable of phenomenal detail. It is best suited to small, complex, high-value plastic parts and is inherently expensive and messy. The first commercial application I saw for this technology was a system for automatically creating patterns for casting gold crowns in a dental lab.
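The layer-by-layer "growing" begins by slicing the CAD solid into horizontal cross-sections, one per cured layer. A simple sketch, slicing a sphere (the shape and dimensions are illustrative, not tied to any real SL controller):

```python
import math

def slice_sphere(radius, layer_height):
    """Slice a sphere into the stack of circular layers an SL machine would
    cure, bottom to top. Returns (z, cross_section_radius) pairs."""
    layers = []
    z = -radius
    while z <= radius:
        # Circle of intersection between the sphere and the plane at height z.
        layers.append((round(z, 6), math.sqrt(max(radius ** 2 - z ** 2, 0.0))))
        z += layer_height
    return layers

# A 10 mm sphere at 0.5 mm layers: the laser traces each circle in turn,
# and the part grows one cured layer at a time.
layers = slice_sphere(10.0, 0.5)
```

Each tuple is one pass of the laser: the machine lowers the platform by one layer height, then traces that cross-section, which is why part height rather than complexity dominates build time.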
A variation on this process, known as Selective Laser Sintering (SLS), uses a laser beam, also controlled from a CAD file, to fuse thermoplastic resin or wax powder along the edge of the prototype. This offers numerous advantages over the thermosetting solution, including the ability to produce metal parts directly from the prototypes via the lost-wax process. This system is being used in the development of prototypes of consumer products like drills and weed-eaters.
FMS tools are even more useful at the production level than they are in the prototype stage, allowing minor design revisions which affect many parts simultaneously to be made "on the fly" without requiring any retooling at all.
FMS tools are currently very expensive. Entry level CNC router technology applicable to wood products manufacturing probably starts at $50,000 at this point.
This is a very different approach to technology from the one that led to the development of automated power-tool factories, in which the lights are kept off to save electricity and come on only when an error triggers an alarm and stops the production line (in which case a path to the offending robotic workstation is illuminated). The latter sounds like a great way to make people obsolete: just a roomful of machines making more machines, with someone at command central eating Fritos and reading comic books, waiting for something to break. But it does not have to turn out that way.
Turning FMS in that direction involves four parallel development efforts:
1) Development of a computer network (data highway) architecture connecting a number of flexible (programmable) manufacturing cells.
2) Development of reusable programming libraries to control the activities of individual manufacturing cells stationed along this data highway.
3) Development of knowledge-based programs to route materials through production processes and to adjust production flow based on "long-term memory" (evaluation of previous production experience) and "short-term memory" (evaluation of feedback from operating cells).
4) Development of simulation software and libraries of cell control language interpreters. These tools allow designers to program virtual manufacturing cells and evaluate the results of planned operations before resources are committed through cell purchase, cell / production-line setup, or the consumption of raw materials.
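The routing idea described above (adjusting production flow based on feedback from operating cells) can be caricatured in a few lines: interchangeable cells sit on the data highway, and each job step is dispatched to the least-loaded cell that offers the required operation. Cell names, operations and times here are invented for illustration:

```python
# Toy model of cells on the data highway: what each can do, and its
# current queue in minutes (the "short-term memory" feedback).
cells = {
    "router_1": {"ops": {"profile", "drill"}, "queue_min": 0},
    "router_2": {"ops": {"profile"},          "queue_min": 0},
    "sander_1": {"ops": {"sand"},             "queue_min": 0},
}

def dispatch(job_steps, cells):
    """Assign each (operation, minutes) step to the least-loaded capable cell."""
    plan = []
    for op, minutes in job_steps:
        capable = [name for name, cell in cells.items() if op in cell["ops"]]
        chosen = min(capable, key=lambda name: cells[name]["queue_min"])
        cells[chosen]["queue_min"] += minutes  # feedback updates the loads
        plan.append((op, chosen))
    return plan

plan = dispatch([("profile", 30), ("profile", 30), ("sand", 15)], cells)
```

The first profiling job goes to router_1, and the queue feedback then steers the second to router_2: the "long-term memory" side of the idea would simply feed better time estimates into the same decision.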
Based on the discussions at WoodNet's pilot training course, a first cut at prioritization for many of the operations involved in computerizing your manufacturing business looks like this.