What (Really) is Generative Design?


There is a whole lot of talk now about “generative design”. The hype around additive manufacturing becoming a viable means of production has created a major shift in the marketing and development of design and simulation software. “Hype” always means there is some blurriness in the facts, and terminology gets used loosely and sometimes incorrectly, whether intentionally or not. So, without the bias of promoting any particular software, I thought I would try to explain some of this terminology, and the processes behind it, in a general and understandable way.

“Generative design” is a very, very broad term and it is commonly confused or blended with the term “parametric design”. To easily distinguish the two, we can think of “parametric” as an equation and “generative” as an algorithm or iterative process.

In parametric design you have set parameters, or “variables”, that feed an equation, and there is only one solution, or “right answer”, for any given set of parameters. With parametric design you can manually change these variables until you find the design you want, the “goal”. This is a concept familiar to most designers who use parametric tools like Solidworks or ProE. The range of parameters and functionality becomes much greater with tools like Grasshopper or Dynamo.
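To make the distinction concrete, here is a minimal sketch of the parametric idea in Python. The bracket and its dimensions are hypothetical; the point is only that one set of parameters always produces exactly one answer, and you search by turning the knobs yourself.

```python
import math

def bracket_volume(length_mm: float, width_mm: float,
                   hole_diameter_mm: float, thickness_mm: float) -> float:
    """Volume (mm^3) of a flat rectangular bracket with one circular hole."""
    plate = length_mm * width_mm * thickness_mm
    hole = math.pi * (hole_diameter_mm / 2) ** 2 * thickness_mm
    return plate - hole

# One set of parameters, one solution -- change a variable, get a new solution.
print(bracket_volume(100, 40, 10, 3))
print(bracket_volume(100, 40, 12, 3))
```

Tools like Grasshopper or Dynamo wrap this same idea in a visual graph, but the equation-like behavior is the same: deterministic output, manually explored.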


Generative very broadly means something that is generated. This could be anything... Really. It could be an algorithm that generates a random number. But a design is typically not random; usually you want a design to fit within some parameters or criteria. Let’s say you have a parametric design and a program that generates random parameters. This will generate millions of solutions unless it knows what you are looking for. So, let’s say you have another program that lets you tell it which designs you like and don’t like. Now it would be able to narrow down its solutions based on those statistics (some would call this AI). This could be a very time-consuming process, because the program doesn’t know exactly what you like about any particular design. The fastest way, instead, would be to give it some actual criteria to look for, or a “goal”. This kind of process is called an “evolutionary solver” or “genetic algorithm”.


Most of what you see today is the result of some kind of “design optimization” using a genetic algorithm. The concept behind genetic algorithms is fairly understandable because, well… they work like evolution. David Rutten, the developer of Grasshopper, describes the process very well in his blog “I Eat Bugs For Breakfast”. The idea is: you have a set number of parameters, or “genes”, and you have a “fitness criterion”. The algorithm generates a set number of solutions per “generation”, then finds the solutions closest to the fitness criterion. The most “fit” then “mate” with each other to produce the next generation. The process can go on forever if you don’t tell it to stop, so you give it a range that you say is “good enough”, or a tolerance of variation between generations. Genetic algorithms can use any number of parameters, but the more parameters, the longer the process will take. Multi-objective solvers can also have more than one “goal”, or fitness criterion. This is helpful if, say, you want something to be strong but also lightweight.
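The loop described above can be sketched in a few lines of Python. The fitness function here is a stand-in (a simple bowl-shaped function with a known minimum); in a real design tool it would be the result of a simulation, and the crossover/mutation scheme below is just one simple choice among many.

```python
import random

random.seed(0)

def fitness(genes):
    # Stand-in fitness criterion: lower is better, minimum at (3, -1).
    x, y = genes
    return (x - 3.0) ** 2 + (y + 1.0) ** 2

def mate(a, b):
    # Crossover: average the parents' genes, then mutate slightly.
    return [(ga + gb) / 2 + random.gauss(0, 0.1) for ga, gb in zip(a, b)]

# A random starting population: 30 solutions, each with two "genes".
population = [[random.uniform(-10, 10) for _ in range(2)] for _ in range(30)]

for generation in range(200):
    population.sort(key=fitness)
    best = population[0]
    if fitness(best) < 1e-3:          # "good enough" stopping tolerance
        break
    parents = population[:10]          # the most "fit" survive...
    population = [mate(random.choice(parents), random.choice(parents))
                  for _ in range(30)]  # ...and mate to form the next generation

print(generation, [round(g, 3) for g in best])
```

With only two genes this converges in a handful of generations; every extra parameter widens the search space, which is why real solvers with dozens of variables run so much longer.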


That brings us to a process that is commonly being dubbed “generative design”... topology optimization. Topology optimization is an iterative optimization process, much like a genetic algorithm. It divides a design volume into many discrete elements, then uses FEM (the finite element method) to simulate those elements. The variables are the density of each element, and the goals are to minimize density and maximize stiffness. By measuring the strain energy of each element, the solver assigns each element a density, with each iteration reducing density wherever it can while increasing it in elements carrying more strain. The result is a material distribution with the least density and the maximum stiffness. Finally, a density threshold (roughly less than 10%) marks elements to be removed. The elements that remain are the optimal topology (material distribution) for the given simulation, with the given volume and number of elements.
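Here is a toy illustration of the density-update idea only, with the strain-energy field given as made-up numbers. Real topology optimization re-runs the FEM simulation every iteration to get fresh strain energies; this sketch just shows density migrating toward high-strain elements, and the ~10% threshold deciding which elements survive.

```python
# Fake per-element strain energies (a real solver gets these from FEM):
strain_energy = [0.02, 0.10, 0.90, 1.00, 0.85, 0.12, 0.03, 0.01]
density = [0.5] * len(strain_energy)   # start with material spread evenly
budget = sum(density)                  # material budget driving normalization

for _ in range(50):
    # Elements carrying more strain energy earn more density...
    total = sum(se * d for se, d in zip(strain_energy, density))
    density = [se * d / total * budget
               for se, d in zip(strain_energy, density)]
    # ...but density must stay physical, so clamp to [0, 1].
    density = [min(max(d, 0.0), 1.0) for d in density]

kept = [d >= 0.1 for d in density]     # ~10% threshold: keep or remove
print([round(d, 3) for d in density])
print(kept)
```

After a few iterations the three high-strain elements saturate at full density and the rest fall below the threshold, which is the toy version of material "disappearing" from lightly loaded regions.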


Most FEA software packages include some kind of topology optimization, and have for a very long time. But due to its popularity there are many more options now. If you can’t afford Ansys or Abaqus, you can try the new simulation tools available in Fusion 360. There is also a cloud-based tool from Frustum called Generate. If you use Rhino and Grasshopper there are some free tools: Millipede by Sawapan, and Topos, which uses the GPU for accelerated computation (used in the animation above and the cover image).

Topology optimization is a great tool, and it has gained so much popularity because of 3D printing, but there is generally only one purpose for it: optimizing the shape of a rigid structure to reduce weight. That’s really it. Although that is the primary job of a lot of engineers. There are also some drawbacks to most solvers. Most topology optimization solvers (as of this writing) can only perform the simulation with isotropic materials (uniform properties in every direction), like metals. This is because the algorithm becomes much more complicated if it needs to consider a material with different mechanical properties in different directions. So it would not necessarily be accurate for 3D printing processes that produce a very orthotropic material (from layers), or for wood, for instance. This is being improved as we speak, and there will be solvers specifically for 3D-printed materials that consider layer strength, build orientation, etc.


Another process that is commonly being dubbed “generative design” is the creation of lattice structures. (Almost) all software tools for creating lattice structures are completely parametric, or at least the process begins parametrically, meaning the user makes informed choices about the design of the lattice. These tools are not generative unless they are used with some kind of optimization or genetic algorithm (which you can do!). The difficult part when designing a lattice structure is that there is no way of knowing how a lattice is going to perform until you set your parameters, create the lattice, and either simulate the model or do physical testing. The difficult part for software developers is that there are so many applications for lattice structures. What are you optimizing for? Lattice structures in additive manufacturing are now being used for structural light-weighting, energy absorption, heat transfer, filtration, medical implants, etc. As of now, there is (usually) nothing automated about the process, and it takes expertise to really understand the benefits and performance of a lattice structure.

But let’s say we focused on one application: structural light-weighting. This is the application most software companies are focusing on, because the big dollars in additive manufacturing are in aerospace and automotive, where weight and performance are important and expensive. Now we have our design criteria: maximize stiffness and minimize weight. There are a few variables at play: the type of unit cell (the repeated pattern), the size of the cells, the orientation of the cells, and the thickness (or density) of the cells. This is because lattice structures are (almost) always anisotropic, meaning they react differently to forces coming from different directions. Their stiffness depends on the type of structure and its orientation. You could absolutely (in theory) run a genetic algorithm with those variables and our two fitness criteria, and you would have a “generative design”. It could take hours, days, or weeks to solve, depending on the complexity of your design and the number of variables. Here is what it looks like to run a simple simulation using Crystallon, Millipede and Galapagos. This uses only a few variations of unit cell and a few cell sizes in one orientation, with the single goal of minimizing deflection, and it can still take hours to finish.
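The shape of that search is easy to sketch in code. The deflection numbers below are stand-ins, not simulation results; in a real workflow every (unit cell × cell size) candidate has to be rebuilt and re-simulated with FEM, which is exactly why even a small sweep can take hours.

```python
import itertools

# Hypothetical per-candidate deflections (mm), as FEM might return them:
deflection_mm = {
    ("octet", 5):  0.42, ("octet", 10):  0.61, ("octet", 20):  0.95,
    ("gyroid", 5): 0.48, ("gyroid", 10): 0.55, ("gyroid", 20): 1.10,
    ("cubic", 5):  0.70, ("cubic", 10):  0.88, ("cubic", 20):  1.40,
}

cells = ["octet", "gyroid", "cubic"]   # a few unit-cell types
sizes = [5, 10, 20]                    # a few cell sizes (mm), one orientation

# Single goal, as in the example above: minimize deflection.
best = min(itertools.product(cells, sizes), key=lambda c: deflection_mm[c])
print(best, deflection_mm[best])
```

Nine candidates is a brute-force sweep; add orientations, thicknesses, and a second objective and the combinations explode, which is where an evolutionary solver like Galapagos earns its keep.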


Another option is beam or shell optimization. This is a process almost identical to topology optimization, except that instead of solid elements with a density, we have beam or shell elements with a thickness. A common method for this is BESO (Bidirectional Evolutionary Structural Optimization), used in popular tools like Karamba. This tends to be the best method, but you still need to design the structure first, and that starting structure may not be the optimal design in the first place.


A common problem I see today is people wanting to take density values from a topology optimization and apply them to the thickness of a lattice structure. The problem with this is that, as I said before, topology optimization (usually) assumes a completely isotropic material, while lattice structures are (almost) always very anisotropic. So depending on the unit cell and its orientation, a lattice will have very different properties, no matter how thick it is. Here is a quick example comparing a topology optimization (assuming an isotropic material) with a lattice structure using beam optimization. As you can see, depending on the type of unit cell, the beam optimization result can be quite different.


The only way to do this properly is to use a technique called homogenization. The idea is to simulate a single unit cell under different loads to build a model of how it behaves. That material model is then used by the topology optimization solver for each element. Again, this requires a solver that can account for orthotropic materials. As of now, only a few companies I know of are in the process of implementing this. The technique is ideal for the “generative design” of lattice structures for structural light-weighting. It could also be possible to apply the same technique to other applications and other “goals”, such as energy absorption, or, with a multi-physics solver, to things like heat transfer.
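The bookkeeping behind homogenization can be sketched like this. The “virtual test” stresses below are made-up stand-ins for what a unit-cell FEM study would return: applying a unit strain along each axis fills in one column of an effective stiffness matrix, and the resulting material model is visibly different in each direction.

```python
# Hypothetical averaged stress responses (MPa) of one lattice unit cell to a
# unit strain applied along each axis, from three virtual FEM tests:
response = {
    "exx": [1200.0, 300.0, 250.0],  # stresses (sxx, syy, szz) for unit exx
    "eyy": [300.0, 800.0, 220.0],   # ...for unit eyy
    "ezz": [250.0, 220.0, 2100.0],  # ...for unit ezz (stiff build direction)
}

# Each virtual test fills one column of the 3x3 effective stiffness matrix C,
# so that stress = C @ strain for the homogenized, "smeared" material:
C = [[response[test][row] for test in ("exx", "eyy", "ezz")]
     for row in range(3)]

for row in C:
    print(row)
```

The diagonal terms differ by direction, which is exactly the anisotropy an isotropic-only topology solver cannot represent, and why the homogenized model has to be fed into an orthotropic-capable solver.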


Hopefully this information gives you a better idea of what generative design (really) is, and some things to consider when using these software tools.