We note that the minimal requisite skill set does not include an in-depth, rigorous, mathematical treatment of the theory underlying FEM. Such rigor, while necessary to program algorithms or as a prerequisite for graduate studies, is not essential for running finite element simulations and correctly interpreting their results. For practical applications of FEA, what is imperative is the ability to distinguish between good and bad methods for interfacing with the tool.
Note To e Instructor
A treatment of the background necessary to use the finite element method effectively is given by
Papadopoulos et al. [2011]. Here we argue that a top-down, theory-first emphasis employed in many cur-
ricula may not be as necessary as has been thought. We believe that teaching the underlying mechanics can
be enhanced by introducing the finite element method as early as an Introduction to Engineering course in
the freshman year. We also feel that hand calculations in Statics and Mechanics of Materials can be reinter-
preted and made more appealing by emphasizing them as steps used to validate and benchmark numerical
simulations. Finally, an upper-division course in finite element theory allows a deeper study of how to perform an informed computational analysis under the guidance of a seasoned practitioner.
1.3 THE TEN MOST COMMON MISTAKES
Computational models are easily
misused…unintentionally or intentionally.
Boris Jeremić
University of California Davis
In accordance with our proposed minimal requisite skill set, we now present a useful list of com-
monly committed errors in FEA practice. While the advanced user will likely recognize many of
these errors (hopefully through direct experience!), the novice who has little or no FEA experi-
ence might not fully appreciate their meaning at this point. Nevertheless, they serve as a good
preview of issues that will arise, and as a reference to which the novice may return as he or she
gains more experience.
Recently, Chalice Engineering, LLC [2009] compiled an assessment of mistakes most
commonly made in performing finite element analysis in industrial practice. After 10 years of col-
lecting anecdotal evidence in both teaching undergraduates and advising capstone design projects,
we found this list to cover nearly all of the most common errors encountered by undergraduate students in their introductory finite element method course. The list published by Chalice Engineering is reproduced here verbatim.
1. Doing analysis for the sake of it: Not being aware of the end requirements of a finite ele-
ment analysis—not all benefits of analysis are quantifiable but an analysis specification
is important and all practitioners should be aware of it.
2. Lack of verification: Not having adequate verification information to bridge the gap be-
tween benchmarking and one’s own finite element analysis strategy. Test data some-
times exists but has been forgotten. Consider the cost of tests to verify what the analysis
team produces, compared with the potential cost of believing the results when they are
wrong.
3. Wrong elements: Using an inefficient finite element type or model, e.g., a 3D model when a 2D model would do, or unreliable linear triangular or tetrahedral elements.
4. Bad post-processing: Not post-processing results correctly (especially stress) or consis-
tently. Not checking unaveraged stresses.
5. Assuming conservatism: Just because one particular finite element analysis is known to be conservative does not mean that a different analysis of a similar structure under different conditions will also be conservative.
6. Attempting to predict contact stresses without modeling contact: This might give sensible-looking results, but is seldom meaningful.
7. Not standardising finite element analysis procedures: is has been a frequent cause of
repeated or lost work. Any finite element analysis team should have a documented stan-
dard modeling procedure for typical analyses encountered within the organisation, and
analysts should follow it wherever possible. Non-standard analyses should be derived
from the standard procedures where possible.
8. Inadequate archiving: Another frequent cause of lost work. Teams should have a master
model store and documented instructions about what and how to archive. Again, this
is a quality related issue. For any kind of analysis data, normal backup procedures are
not sufficient—attention needs to be paid to what information and file types are to be
archived in order to allow projects to be retraced, but without using excessive disk space.
9. Ignoring geometry or boundary condition approximations: Try to understand how in-
appropriate restraint conditions in static or dynamic analyses can affect results.
10. Ignoring errors associated with the mesh: Sometimes these can cancel out errors asso-
ciated with mistake 9, which can confuse the user into thinking that the model is more
accurate than it is. A convergence test will help.
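Mistake 10 recommends a convergence test. As a minimal illustration (the geometry, material values, and function names below are our own assumptions, not from the Chalice Engineering list), the following Python sketch solves a uniformly loaded 1D bar with linear elements and tracks the maximum element stress as the mesh is refined. For this problem the nodal displacements happen to be exact on any mesh, yet the element-constant stress, the quantity an analyst typically reports, still carries a mesh-dependent error; repeating the analysis on successively finer meshes is precisely what exposes it.

```python
import numpy as np

def bar_fe_max_stress(n_elem, L=1.0, E=200e9, A=1e-4, q=1e3):
    """Linear-element FE solution of an axial bar: EA u'' = -q on (0, L),
    fixed at x = 0, free at x = L. Returns the maximum element stress."""
    h = L / n_elem
    n_nodes = n_elem + 1
    K = np.zeros((n_nodes, n_nodes))
    f = np.zeros(n_nodes)
    ke = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    fe = (q * h / 2.0) * np.array([1.0, 1.0])  # consistent load, uniform q
    for e in range(n_elem):                    # assemble global system
        K[e:e + 2, e:e + 2] += ke
        f[e:e + 2] += fe
    u = np.zeros(n_nodes)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])  # impose u(0) = 0
    # stress is constant within each linear element: sigma = E * du/dx
    return (E * np.diff(u) / h).max()

# exact peak stress at the fixed end: sigma_max = q * L / A
exact = 1e3 * 1.0 / 1e-4
for n in (2, 4, 8, 16, 32):
    s = bar_fe_max_stress(n)
    print(f"{n:3d} elements: rel. error {abs(s - exact) / exact:.4f}")
```

The relative error in the peak stress halves with each uniform refinement, so tabulating it against element count (a convergence test) shows at a glance whether the reported stress can be trusted; a single-mesh result gives no such indication.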