Index
Note: Page numbers followed by “f” and “t” refer to figures and tables, respectively.
A
Access, gaining
organized crime groups,
131
state-sponsored/terrorist groups,
130–131
trouble causers, hobbyists, and lone gunmen,
131–132
to underground bunker data center,
121–122
Active e-mail reconnaissance,
214
Active information gathering,
257–264
Active spidering with OWASP Zed Attack Proxy,
160–163
wireless access points,
306
Arms and legs, in body language,
62
Associates engagement,
142
Authorization e-mail,
250
Awareness and training program,
340–345
awareness without training,
342–343
model for effective training,
345–358
weak training programs, taking advantage of,
344–345
wrong management model, choosing,
343–344
B
Belkin N300 Go access point,
306
Buckingham Palace breach,
14
Business security chain,
19
customer service mentality,
29–31
data classifications,
25–27
lack of awareness and training,
32
poor management example,
31
problem with privileges,
23–25
secure data with vulnerable users,
22–23
weakest link, personnel as,
20–37
weak security policies,
32–36
Business, gaining access to,
127
C
Call
Chain of authentication,
48–50
Cisco Discovery Protocol (CDP),
274
Client application patching,
221
Cloned web sites, using
Compromising internal systems,
306
Computer Fraud and Abuse Act (CFAA),
104–105
Computer Misuse Act (CMA) 1990 (UK),
100–101
Conversation, focusing,
77
Craft spear phishing attacks,
127
Credential Harvester Attack Method,
231f
Credentials and e-mail access,
249–251
Credibility, gaining,
50–51
Customer service mentality,
29–31
D
Data classifications,
25–27
Defending against phishing attacks,
221
Defining social engineering,
2–5
Delivery of the report,
325
Denial-of-service (DoS) attacks,
346–347
Departmental awareness and training,
350
Departmental requirements,
348
Dirty Rotten Scoundrels,
Disgruntled employees,
81
Distributed denial-of-service (DDoS) attacks,
346–347
Dynamic Trunking Protocol (DTP),
274
E
E-mail attack vector,
205
active e-mail reconnaissance,
214
cloned web sites to harvest credentials,
230–232
defending against phishing attacks,
221
nondelivery reports (NDRs),
214–215
real-world phishing examples,
210
recruitment consultant,
220
setting up your own attack,
222
spear phishing attack vector,
224–227
spear phishing versus trawling,
208
spoofed e-mails versus fake domain names,
223–224
weaponizing the scenario,
218
work experience placements,
218
E-mail phishing scams,
45
Employee names on corporate website,
156
Employee Social Engineering Security Awareness and Education Programming,
328
Exif (exchangeable image file format) data,
174
Extended phishing attacks,
78–80
External assessments,
362
F
Fake social media profiles,
74–76
Financial Times breach,
14–15
Framing information,
55–56
“Free Internet Access”,
263
G
General Data Protection Regulation (GDPR),
91
Get out of jail free cards,
110
Google Maps Street View,
260
H
Hardened policies and procedures,
327–328
industry information security and cyber security standards,
331–334
resetting user passwords, procedure for,
336–338
social engineering defense,
330–331
Host-based intrusion prevention systems (HIPS),
221
Huawei E160 USB stick,
287
Human resource security, domain 8,
93–94
I
Identity of the caller, confirming,
336–337
Impersonating staff members,
247
Individual awareness and training,
350
Individual impersonation account,
75
Industry information security,
331–334
Information Assurance (IA) process,
120
Information Commissioner’s Office,
124
Information disclosure policies,
70
Information elicitation,
76–78
Information Security Management System (ISMS) document set,
334
Information Security program,
331
Internal social engineering assessments,
358,
361
designing the internal test,
365–369
testing the infrastructure,
366–368
testing the people and processes they follow,
368–369
Internet service provider (ISP),
41
Intrusion Prevention Systems (IPS),
308
ISO/IEC (International Organization for Standardization/International Electrotechnical Commission),
89
ISO/IEC 27000 information security series,
93–95
human resource security, domain 8,
93–94
physical and environmental security, domain 9,
94–95
ISO/IEC 27001/2:2013,
332
J
K
Key information and access, obtaining,
249
L
Lack of awareness and training,
32
Legislative considerations,
99–105
Computer Fraud and Abuse Act (CFAA),
104–105
Computer Misuse Act (CMA) 1990 (UK),
100–101
Human Rights Act 1998 (UK),
103–104
Police and Justice Act 2006 (UK),
101
Regulation of Investigatory Powers Act 2000 (UK),
101–102
Leveraging authority,
46–47
Long-term attack strategies,
71–86
expanding on initial reconnaissance,
72–74
extended phishing attacks,
78–80
fake social media profiles,
74–76
gaining inside help,
81–82
information elicitation,
76–78
long-term surveillance,
83–86
targeting partner companies,
82–83
working at target company,
82
Long-term surveillance,
83–86
Low-rent e-mail scam,
213f
M
MAC Authentication Bypass (MAB),
274
Malicious hyperlink,
229f
Malicious USB sticks,
308
Man-In-The-Middle (MITM) attack,
103
Manipulation, techniques of,
39
chain of authentication,
48–50
credibility, gaining,
50–51
framing information,
55–56
from innocuous to sensitive,
51–52
leveraging authority,
46–47
personality types and models,
58–60
pressure and solution,
45–46
priming and loading,
52–53
reverse social engineering,
48
selective attention,
57–58
Matchstick Men,
Microsoft Xbox breach,
15–16
N
NIST SP800-30 standard,
127,
128
Nondelivery reports (NDRs),
214–215
O
Open-source intelligence, leveraging,
154–155
partners, clients, vendors,
156
photos of employees and business locations,
157–158
reverse image search engines,
179–180
Open-source reconnaissance,
140
routing issues and how to overcome,
302–304
Organization impersonation account,
75
Organized crime groups,
131
OS folder structures and text editor,
313–314
Outbound content filtering,
221–222
Oz, Frank,
P
Partner companies, targeting,
82–83
Password
Patterns of behavior,
141
PCI DSS (Payment Card Industry Data Security Standard),
89
Penetration Testing Execution Standard (PTES),
105,
371
People and processes, testing of,
365
Personality types and models,
58–60
Personnel, as weakest link,
20–37
Phishing Email Attack,
313,
324
awareness and training,
70
Phone survey policies,
70
Physical and environmental security, domain 9,
94–95
Physical attack vector,
active information gathering,
257–264
e-mail and telephone attacks, building on,
256–257
Physical covert engagement,
142
scenario specific outcomes,
146–148
Plausible situation,
40,
41
Police and Justice Act 2006 (UK),
101
Poor management example,
31
Pre-engagement interaction,
106,
372
Problem with privileges,
23–25
Q
Questions
R
Real-world phishing examples,
210
Recruitment consultant,
220
Regulation of Investigatory Powers Act (RIPA),
102
Regulation of Investigatory Powers Act 2000 (UK),
101–102
Remote Telephone Attack,
313,
324
Requests, acceptable sources of,
336
Reverse image search engines,
179–180
Reverse social engineering,
48
Robinson, Phil Alden,
RSA 2-factor authentication,
50–51
S
Scenario specific outcomes,
146–148
Secure data with vulnerable users,
22–23
Selective attention,
57–58
Setting up your own attack,
222
Short-term attack strategies,
66–71
common short game scenarios,
69–71
targeting the right areas,
66–68
using the allotted time effectively,
68–69
Social engineering cheat sheet,
374f,
375
Social Engineering Defensive Strategy,
327–328
Social engineering engagement,
87
compliance and security standards,
90–91
ISO/IEC 27000 information security series,
93–95
payment card industry data security standard,
91–93
dealing with unrealistic time frames,
96–97
dealing with unrealistic time scales,
96
less mission impossible,
95–96
more mission improbable,
95–96
taking one for the team,
97
client, challenges for,
98–99
getting the right people,
98–99
legislative considerations,
99–105
pre-engagement interactions,
106
operational considerations,
95
team members and skill sets,
114–117
pre-engagement interaction,
106,
372
Social engineering methodology,
321–322
Social engineering overview,
321
Social engineering toolkit (SET),
205
Social security numbers (SSNs),
15–16
Softley, Iain,
Spam and antivirus products,
221
Spoofed e-mails versus fake domain names,
223–224
Staff members, impersonating,
247
State-sponsored/terrorist groups,
130–131
T
Tailgating/piggybacking awareness and training,
71
Targeted scenarios, creating,
135–136
scenario specific outcomes,
146–148
open-source reconnaissance,
140
Technology, supporting attack with,
271
cable and live port testers,
273
installing the operating system,
280–283
screens, wireless, and other hardware,
292–299
Telephone attack vector,
235
credentials and e-mail access,
249–251
e-mail attack, building on,
240–241
obtaining key information and access,
249
out-of-office response,
241
staff members, impersonating,
247
Telephone attacks, building on,
256–257
Third-party help desk,
349
Threat modeling,
gaining access to business,
129–133
organized crime groups,
131
state-sponsored/terrorist groups,
130–131
trouble causers, hobbyists, and lone gunmen,
131–132
consultant-led threat modeling,
122
gaining access to underground bunker data center,
121–122
plugging into information assurance and risk management processes,
126–129
Tiger Team,
Traditional learning,
314
departmental requirements,
348
Trojan horse-style attack,
329
U
Underground bunker data center, gaining access to,
121–122
USB sticks, malicious,
308
User passwords, resetting,
identity of the caller, confirming,
336–337
protecting passwords,
337
requests, acceptable sources of,
336
V
Validating callers over the phone,
355
Visitor/contractor booking procedures,
70
W
Weak security policies,
32–36
Weaponizing the scenario,
218
Work experience placements,
218
Working at target company,
82
Z
Zero-day exploitation,
15