Index

Note: Page numbers followed by “f” and “t” refer to figures and tables, respectively.

A

Abagnale, Frank, 11–12
Access, gaining
to business, 127
organized crime groups, 131
state-sponsored/terrorist groups, 130–131
trouble causers, hobbyists, and lone gunmen, 131–132
to underground bunker data center, 121–122
Active e-mail reconnaissance, 214
Active information gathering, 257–264
dumpster diving, 258–259
photography, 260–261
public access areas, 262–263
reception area, 261–262
rogue access points, 263–264
shoulder surfing, 259–260
Active spidering with OWASP Zed Attack Proxy, 160–163
Adafruit Pi Plate, 293–296
Advance fee fraud, 211
Alternative dropboxes, 304–306
3G and IP KVMs, 304–305
routers, 305
wireless access points, 306
American Express, 210–211
Anger/indignation, 46
Antitheft signage, 54
Apple ID scam, 211–212, 213f
Apple store, 237
Arms and legs, in body language, 62
Asset, 122
access to, 123
importance of, 123
protection, 124
Associates engagement, 142
Atanasoff Gavin, Dr., 211
Audio recording devices, 307–308
Authorization e-mail, 250
Availability, 27–29
Awareness and training program, 340–345
awareness without training, 342–343
model for effective training, 345–358
development, 349–356
implementation, 356–357
individual departments, 347–349
maintenance, 357–358
planning and design, 346–347
role of management, 346
need for, 341–342
weak training programs, taking advantage of, 344–345
wrong management model, choosing, 343–344

B

BackTrack, 222
Badges and lanyards, 265–266
Badir, Muzher, 12
Badir, Ramy, 12
Badir, Shadde, 12
Bait account, 75
Baiting, 44–45
Bargh, John, 52–53
BBC Weather, 134
Belkin N300 Go access point, 306
Body language, 60–62
Bourdin, Frédéric, 9–10
Brute forcing, 198
Buckingham Palace breach, 14
Burner, 114–115
Business security chain, 19
availability, 27–29
customer service mentality, 29–31
data classifications, 25–27
functionality, 27–29
lack of awareness and training, 32
poor management example, 31
problem with privileges, 23–25
secure data with vulnerable users, 22–23
security, 27–29
security policies, 33
weakest link, personnel as, 20–37
weak procedures, 36–37
weak security policies, 32–36
Business, gaining access to, 127
Buzan, Tony, 314

C

Call
handlers, 242
sales, 245–246
weaponizing, 252–253
Caller ID, 240
issues with, 238
spoofing, 238–239
transferring, 239–240
Camion, 16
Car dealerships, 56
Card cancelation scams, 237–238
CeWL project, 199
Chain of authentication, 48–50
Character, 41–42
Cisco Discovery Protocol (CDP), 274
Client application patching, 221
Cloned web sites, using
to harvest credentials, 230–232
Clydesdale, 16
“Cocktail party effect”, See Selective attention
Common ground, 78
Compliance drivers, 349
Compromising internal systems, 306
Computer Fraud and Abuse Act (CFAA), 104–105
Computer Misuse Act (CMA) 1990 (UK), 100–101
Contact database, 239
Content filtering, 221–222
Conversation, focusing, 77
Corporate website, 155–182
business purpose, 155–156
document metadata, 164–174
document obfuscation, 181–182
e-mail addresses, 156
employee names, 156
partners, clients, vendors, 156
PDFGrep, 180–181
phone numbers, 157
photographic metadata, 174–179
photos of employees and business locations, 157–158
reverse image search engines, 179–180
spidering, 158–164
staff hierarchy, 156–157
Way Back Machine, 182
Cover page, 319–320
Cover stories, 148–149
Craft spear phishing attacks, 127
Crawlers, 158–164
Credential Harvester Attack Method, 231f
Credential harvesting, 211–212
Credentials and e-mail access, 249–251
Credibility, gaining, 50–51
CuBox, 279
Customer service mentality, 29–31
Cyber security standards, 331–334
“The Cyberspace”, 333

D

Data classifications, 25–27
Data collection, 312–319
document management tools, 316–319
Mind Mapping, 314–316, 315f
OS folder structures and text editor, 313–314
Data Protection Act, 123, 124
Data storage, 123–124
Defending against phishing attacks, 221
Defining, social engineering, 2–5
Delivery of the report, 325
Denial-of-service (DoS) attacks, 346–347
Departmental awareness and training, 350
Departmental requirements, 348
Departmental risks, 348
Departmental training, 354–355
Development, 349–356
departmental training, 354–355
ensuring impact, 350–351
foundational awareness, 351–352
foundational training, 352–354
individual training, 355–356
Dirty Rotten Scoundrels, 9
Disclaimer page, 320
Disgruntled employees, 81
Distributed denial-of-service (DDoS) attacks, 346–347
DNS records, 197–203
CeWL project, 199
dnsrecon, 197–198
making use of the intel, 201–203
subdomain brute forcing, 198–199
Whois records, 200–201
Dnsrecon, 197–198
Document management tools, 316–319
Document metadata, 164–174
FOCA, 166–170
Metagoofil, 170–173
strings, 165–166
use of, 173–174
Document obfuscation, 181–182
Dradis Pro, 317, 317f
Dropbox, 275–280
building, 280
challenges, 275–276
Intel NUC, 278–279, 279f
Pwnie Express, 276–277
RaspberryPi, 277, 278f
Drucker, Peter, 361
Dumpster diving, 258–259
Dynamic Trunking Protocol (DTP), 274

E

eBay, 16–17
E-mail access, 249–251
E-mail address, 154, 156, 182–191
conventions, 183–191
FOCA, 186
Jigsaw, 187–190
Metagoofil, 186
recon-ng, 190–191
Sam Spade, 187
theharvester, 183–186
Whois, 186–187
insider knowledge, 183
password attacks, 183
phishing attacks, 182–183
E-mail attack vector, 205
active e-mail reconnaissance, 214
American Express, 210–211
Apple ID scam, 211–212
Dr. Atanasoff Gavin, 211
cloned web sites to harvest credentials, 230–232
defending against phishing attacks, 221
human approaches, 222
malicious java applets, 228–230
nondelivery reports (NDRs), 214–215
nonexistent meeting, 216
out-of-office responses, 215–216
phishing attacks, 206
client-side attack, 207–208
working of, 206–208
real-world phishing examples, 210
recruitment consultant, 220
salesperson, 220
SET, 224
setting up your own attack, 222
spear phishing attack vector, 224–227
spear phishing versus trawling, 208
spoofed e-mails versus fake domain names, 223–224
technological approaches, 221–222
weaponizing the scenario, 218
work experience placements, 218
E-mail attack, building on, 240–241, 256–257
E-mail phishing scams, 45
Emotional states, 56–57
Employee names on corporate website, 156
Employee numbers, 248–249
Employee Social Engineering Security Awareness and Education Programming, 328
Ensuring impact, 350–351
Environment, 238
Ethical hacker, 114
Executive summary, 323
Exif (exchangeable image file format) data, 174
Exiftool, 174–175
Exit strategies, 149–150
Exploitation, 107
Extended phishing attacks, 78–80
External assessments, 362
Extrovert, 60
Eye contact, 62

F

Facebook Hacktober, 364–365, 364f
Facebook, 191, 194–195
“Failed” scenario, 150
Fake social media profiles, 74–76
Fear, 46, 57
The Financial Times, 134
Financial Times breach, 14–15
“Finder” in OSX, 180
Firewall, 121, 124, 221–222
Flattery, 77
Fluke Networks, 273
FOCA, 166–170, 186
Foundational awareness, 350, 351–352
Foundational training, 350, 352–354
419 scam, See Advance fee fraud
FoxyProxy settings, 161f
Framing information, 55–56
“Free Internet Access”, 263
Functionality, 27–29

G

Gavin, Atanasoff, 211
General Data Protection Regulation (GDPR), 91
Generalist, 114
GeoSetter, 177–179
Get out of jail free cards, 110
Goldstein, Noah, 54
Google Maps Street View, 260

H

Hackers, 7–8
Hacktober, 364–365, 364f
Hadnagy, Chris, 12
Hardened policies and procedures, 327–328
background, 328–330
developing fit-for-purpose policies and procedures, 334–338
resetting user passwords, procedure for, 336–338
industry information security and cyber security standards, 331–334
expected changes, 332–334
social engineering defense, 330–331
Help desk, 247–248
Hobbyists, 131–132
Host-based intrusion prevention systems (HIPS), 221
Huawei E160 USB stick, 287
Human approaches, 222
Human resource security, domain 8, 93–94
Human Rights Act 1998, 103–104
Hydra, 368

I

ID badges, 157–158
Identity of the caller, confirming, 336–337
Ignorance, 77
Image picker, 176–177
Impersonating staff members, 247
Impersonation, 42–44
Implementation, 356–357
outside assistance, 356–357
The Imposter, 9–11
Inconsistencies, 42
Indignation, 46
Indirect referencing, 77
Individual attack vectors, 324–325
Individual awareness and training, 350
Individual impersonation account, 75
Individual training, 355–356
Industry information security, 331–334
Information Assurance (IA) process, 120
Information assurance, 126–129
Information Commissioners Office, 124
Information disclosure policies, 70
Information elicitation, 76–78
Information gathering, See Active information gathering
Information Security Management System (ISMS) document set, 334
Information Security program, 331
Information, use of, 164
Infrastructure testing, 365, 366–368
password auditing, 367–368
vulnerability scanning, 366–367
“Inside man”, 81
Inside the building, 268–269
Insider knowledge, 183
Intel, 201–203
Intel NUC, 278–279, 279f
Intelligence gathering, 106–107, 372–373
Internal social engineering assessments, 358, 361
designing the internal test, 365–369
testing the infrastructure, 366–368
testing the people and processes they follow, 368–369
need for internal testing, 361–365
Facebook Hacktober, 364–365, 364f
roles, 362–363
Internet service provider (ISP), 41
Introduction, 323
Introvert, 60
Intrusion Prevention Systems (IPS), 121, 124, 308
ISO/IEC (International Organization for Standardization/International Electrotechnical Commission), 89
ISO/IEC 27000 information security series, 93–95
human resource security, domain 8, 93–94
physical and environmental security, domain 9, 94–95
ISO/IEC 27001/2:2013, 332
ISO/IEC 27002:2013, 332–333

J

Jigsaw puzzle, 51–52
Jigsaw, 187–190
Job enquiries, 244–245
Jung’s theory, 60

K

Kali Linux, 222
Karma attack, 103–104
Kevin Mitnick, 236
Key information and access, obtaining, 249
Keyloggers, 306–307
Kindness, 57
Kratos, 16

L

Lack of awareness and training, 32
Lanyards, 265–266
Legislative considerations, 99–105
Computer Fraud and Abuse Act (CFAA), 104–105
Computer Misuse Act (CMA) 1990 (UK), 100–101
Human Rights Act 1998 (UK), 103–104
Police and Justice Act 2006 (UK), 101
Regulation of Investigatory Powers Act 2000 (UK), 101–102
Leveraging authority, 46–47
Lightning, 16
LinkedIn, 183, 191–192
Linux, 170
Listening, 78
Loading, See Priming
Lock picking, 267–268
Lone gunmen, 131–132
Long-term attack strategies, 71–86
expanding on initial reconnaissance, 72–74
extended phishing attacks, 78–80
fake social media profiles, 74–76
gaining inside help, 81–82
information elicitation, 76–78
long-term surveillance, 83–86
targeting partner companies, 82–83
working at target company, 82
Long-term surveillance, 83–86
Low rent e-mail scam, 213f

M

MAC Authentication Bypass (MAB), 274
Mac OSX 10.0.4, 165–166
Mail server, 221
Maintenance, 357–358
Malicious hyperlink, 229f
Malicious java applets, 228–230
Malicious USB sticks, 308
Management, role of, 346
Man-In-The-Middle (MITM) attack, 103
Manipulation, techniques of, 39
baiting, 44–45
body language, 60–62
chain of authentication, 48–50
credibility, gaining, 50–51
emotional states, 56–57
framing information, 55–56
from innocuous to sensitive, 51–52
impersonation, 42–44
leveraging authority, 46–47
personality types and models, 58–60
pressure and solution, 45–46
pretexting, 40–42
priming and loading, 52–53
reverse social engineering, 48
selective attention, 57–58
social proof, 53–55
Martin, Steve, 54
Matchstick Men, 8
Metagoofil, 170–173, 186
Metasploit, 226, 307
Microexpression, 62
Microsoft X-Box breach, 15–16
Mind Mapping, 313, 314–316, 315f
Mitnick, Kevin, 11
MS08-067, 366–367

N

Nickerson, Chris, 12
Nigerian Scams, See Advance fee fraud
NIST SP800-30 standard, 127, 128
Noncall handlers, 242
Nondelivery reports (NDRs), 214–215
Nonexistent meeting, 216

O

One-upmanship, 76
The Onion, 134
Onsite Physical Attack, 313, 324–325
Open questions, 77
Open-source intelligence, leveraging, 154–155
corporate website, 155–182
business purpose, 155–156
document metadata, 164–174
e-mail addresses, 156
employee names, 156
partners, clients, vendors, 156
phone numbers, 157
photographic metadata, 174–179
photos of employees and business locations, 157–158
reverse image search engines, 179–180
spidering, 158–164
staff hierarchy, 156–157
Way Back Machine, 182
DNS records, 197–203
CeWL project, 199
dnsrecon, 197–198
making use of the intel, 201–203
subdomain brute forcing, 198–199
Whois records, 200–201
e-mail address, 182–191
conventions, 183–191
insider knowledge, 183
password attacks, 183
phishing attacks, 182–183
social media, 191–196
Facebook, 194–195
LinkedIn, 191–192
Recon-ng, 193–194, 195–196
Twitter, 195
Open-source reconnaissance, 140
OpenVPN, 299–304
configure, 301
configuring the client, 301–302
install, 300–301
routing issues and how to overcome, 302–304
“OpenWifi”, 263
Operation Camion, 16–17, 47
Organization impersonation account, 75
Organized crime groups, 131
OS folder structures and text editor, 313–314
OSX, 170
Outbound content filtering, 221–222
Outlook Web Access (OWA), 183, 202, 249
Out-of-office response, 215–216, 241
Oz, Frank, 9

P

Partner companies, targeting, 82–83
Passive Spider, 158–160
Password
attacks, 183
auditing, 367–368
changing, 338
choosing, 337
guidance, 337
metadata, 167–168
protecting, 337
reset procedures, 69–70
Patterns of behavior, 141
Payroll 2014, 45
PCI DSS (Payment Card Industry Data Security Standard), 89
PDFGrep, 180–181
Penetration Testing Execution Standard (PTES), 105, 371
People and processes, testing of, 365
Personalities, 59
Personality types and models, 58–60
Personnel, as weakest link, 20–37
Phishing attacks, 182–183, 206
client-side attack, 207–208
working of, 206–208
Phishing Email Attack, 313, 324
Phishing e-mails, 45
awareness and training, 70
Phone numbers, 157
Phone scam, 237
Phone survey policies, 70
Phone system hacks, 239
Photographic metadata, 174–179
exiftool, 174–175
GeoSetter, 177–179
image picker, 176–177
wget, 177
Photography, 260–261
Physical access, 251
zero day, 251–252
Physical and environmental security, domain 9, 94–95
Physical attack vector, 255–256
active information gathering, 257–264
dumpster diving, 258–259
photography, 260–261
public access areas, 262–263
reception area, 261–262
rogue access points, 263–264
shoulder surfing, 259–260
badges and lanyards, 265–266
e-mail and telephone attacks, building on, 256–257
inside the building, 268–269
lock picking, 267–268
props and disguises, 264–265
tailgating, 266–267
Physical covert engagement, 142
Planning and design, 346–347
Planning for unknown, 146–150
cover stories, 148–149
exit strategies, 149–150
scenario specific outcomes, 146–148
Plausibility, 248
Plausible situation, 40, 41
Police and Justice Act 2006 (UK), 101
Poor management example, 31
Post exploitation, 107–108
Posture and presence, 62
Power of social influence, See Social proof
Pre-engagement interaction, 106, 372
Pressure and solution, 45–46, 56
Pretext design mapping, 143–146
Pretexting, 40–42
character, 41
plausible situation, 40
Priming, 52–53
Privacy issues, 191
Proactive approach, 330–331
Problem with privileges, 23–25
Props and disguises, 264–265
Proxies, 221–222
Public access areas, 262–263
Pwnie Express, 276–277

Q

Questions
challenging, 148
harmless, 148
Quid pro quo, 78

R

Rainbow, 16
RaspberryPi, 273, 277, 278f
Reactive security awareness and training, See Awareness and training program
Real-world attacks, 13–17, 65
Real-world phishing examples, 210
Receptionists, 261
Reconnaissance, 68, 72–74, 262, 321
physical, 141–142
Recruitment consultant, 220
Regulation of Investigatory Powers Act (RIPA), 102
Regulation of Investigatory Powers Act 2000 (UK), 101–102
Remote connection, 275
3G/4G support, adding, 287–291
configuring SSH tunnels, 284–287
dropbox, 275–280
installing the operating system, 280–283
installing useful tools, 291–292
phoning home, 283–284
screens, wireless, and other hardware, 292–299
Adafruit Pi Plate, 293–296
Alternative dropboxes, 304–306
Compromising internal systems, 306
OpenVPN, 299–304
Wireless dongles, 296–297
useful gadgets, 306–309
Audio recording devices, 307–308
Keyloggers, 306–307
Malicious USB sticks, 308
Teensy USB, 308
WiFi Pineapple, 308–309
Remote Telephone Attack, 313, 324
Report writing, See Writing the report
Reporting, 108, 322
Requests, acceptable sources of, 336
Reverse image search engines, 179–180
Reverse social engineering, 48
RFID, 265–266
Risk management processes, 126–129
Robinson, Phil Alden, 5
Rogue access points, 263–264
RSA 2-factor authentication, 50–51
RSA breach, 13–14
RSMangler, 368

S

Sakis3g, 288
menu, 289f
Sales calls, 245–246
Salespeople, 132, 133, 220
Sam Spade, 187
Scenario creation, 68–69, 321–322
Scenario execution, 68–69, 322
Scenario specific outcomes, 146–148
Scout, 115–116
Scrivener, 317, 318f
Secure data with vulnerable users, 22–23
Security guards, 47
Security passes, 355
Security, 27–29
Selective attention, 57–58
SET, 224
Setting up your own attack, 222
Short-term attack strategies, 66–71
common short game scenarios, 69–71
targeting the right areas, 66–68
using the allotted time effectively, 68–69
Shoulder surfing, 259–260
Smartronix, 273
Smiling, 61
Sneakers, 5–7
Social engineer, 115
Social engineering cheat sheet, 374f, 375
Social Engineering Defensive Strategy, 327–328
Social engineering engagement, 87
assessment prerequisites, 108–110
contact details, 109
scope limitations, 110
scoping documents, 109
type of testing, 109
business need for, 89–95
compliance and security standards, 90–91
ISO/IEC 27000 information security series, 93–95
payment card industry data security standard, 91–93
challenges for, 95–98
dealing with unrealistic time frames, 96–97
dealing with unrealistic time scales, 96
less mission impossible, 95–96
more mission improbable, 95–96
name and shame, 97–98
project management, 98
taking one for the team, 97
client, challenges for, 98–99
getting the right people, 98–99
legislative considerations, 99–105
frameworks, 105–108
exploitation, 107
intelligence gathering, 106–107
post exploitation, 107–108
pre-engagement interactions, 106
reporting, 108
threat modeling, 107
key deliverables, 110–113
debrief, 110–112
report, 112–113
operational considerations, 95
team members and skill sets, 114–117
burner, 114–115
ethical hacker, 114
generalist, 114
scout, 115–116
social engineer, 115
thief, 116–117
Social engineering framework, 105–108, 371–374, 372f
execution, 373
exploitation, 107
intelligence gathering, 106–107, 372–373
post-execution, 373
post exploitation, 107–108
pre-engagement interaction, 106, 372
reporting, 108, 373–374
threat modeling, 107, 373
Social engineering methodology, 321–322
reconnaissance, 321
reporting, 322
scenario creation, 321–322
scenario execution, 322
threat modeling, 321
Social engineering overview, 321
Social engineering toolkit (SET), 205
Social engineers, 11–12
Social media, 191–196
Facebook, 194–195
LinkedIn, 191–192
Recon-ng, 193–194, 195–196
Twitter, 195
Social proof, 53–55
Social security numbers (SSNs), 15–16
Softley, Iain, 7
Software vulnerability, 366–367
Spam and antivirus products, 221
Spear phishing, 209
attack vector, 224–227
Spidering, 158–164
active, 160–163
passive, 158–160
“Spoofcard”, 238–239
Spoofed e-mails versus fake domain names, 223–224
“Spookcall”, 239
Staff awareness and training programs, See Awareness and training program
Staff awareness training, 361–362
Staff hierarchy, 156–157
Staff members, impersonating, 247
State-sponsored/terrorist groups, 130–131
Strings, 165–166
Strong password, 337
Subdomain brute forcing, 198–199
Surveys, 246–247
Swindling money, 44
“Syrian Electronic Army” (SEA), 130–131, 134

T

Table of contents, 320–321
Tailgating, 266–267
Tailgating/piggybacking awareness and training, 71
Target engagement, 142–143
Target profiling, 140–141
Targeted account, 75
Targeted scenarios, creating, 135–136
components of a scenario, 136–138
designing to fail, 150–151
planning for unknown, 146–150
cover stories, 148–149
exit strategies, 149–150
scenario specific outcomes, 146–148
pretext design mapping, 143–146
target identification, 138–143
open-source reconnaissance, 140
physical reconnaissance, 141–142
target engagement, 142–143
target profiling, 140–141
Technological approaches, 221–222
Technology, supporting attack with, 271
attaching to network, 272–275
cable and live port testers, 273
netbooks, 273
port security, 274–275
subnet, 274
remote connection, 275
adding 3G/4G support, 287–291
configuring SSH tunnels, 284–287
dropbox, 275–280
installing the operating system, 280–283
installing useful tools, 291–292
phoning home, 283–284
screens, wireless, and other hardware, 292–299
useful gadgets, 306–309
Teensy USB, 308
Teensy, 45
Telephone attack vector, 235
caller ID, 240
issues with, 238
spoofing, 238–239
contact database, 239
credentials and e-mail access, 249–251
e-mail attack, building on, 240–241
employee numbers, 248–249
environment, 238
help desk, 247–248
job enquiries, 244–245
obtaining key information and access, 249
out-of-office response, 241
phone system hacks, 239
physical access, 251
physical access zero day, 251–252
real-world examples, 236–238
card cancelation scams, 237–238
Kevin Mitnick, 236
sales calls, 245–246
staff members, impersonating, 247
surveys, 246–247
transferring caller ID, 239–240
weaponizing your call, 252–253
whom to call, 242–244
Telephone attacks, building on, 256–257
Theharvester, 183–186
Thief, 116–117
Third-party help desk, 349
Threat modeling, 107, 119, 321, 373
gaining access to business, 129–133
organized crime groups, 131
state-sponsored/terrorist groups, 130–131
trouble causers, hobbyists, and lone gunmen, 131–132
need for, 120–129
consultant led threat modeling, 122
gaining access to underground bunker data center, 121–122
how approach, 124–126
plugging into information assurance and risk management processes, 126–129
what approach, 122, 124
where approach, 123–124, 125
who approach, 123, 125
why approach, 123, 125
Tiger Team, 5
Time allocation, 68–69
Title page, 320
2Day FM radio prank, 131–132, 134
TP-Link, 252
Traditional learning, 314
Training, effective, 345–358
development, 349–356
departmental training, 354–355
ensuring impact, 350–351
foundational awareness, 351–352
foundational training, 352–354
individual training, 355–356
implementation, 356–357
outside assistance, 356–357
individual departments, 347–349
compliance drivers, 349
departmental requirements, 348
departmental risks, 348
procedures, 349
maintenance, 357–358
planning and design, 346–347
role of management, 346
Trammel, 16
Transferring caller ID, 239–240
Trawling, 208
Trojan horse-style attack, 329
Trouble causers, 131–132
Troy siege, 329
Trust, 57
Twitter, 195

U

Underground bunker data center, gaining access to, 121–122
USB flash drives, 44–45, 45
USB keylogger, 307f
Useful gadgets, 306–309
Audio recording devices, 307–308
Keyloggers, 306–307
Malicious USB sticks, 308
Teensy USB, 308
WiFi Pineapple, 308–309
User passwords, resetting, 336–338
changing passwords, 338
choosing passwords, 337
identity of the caller, confirming, 336–337
password guidance, 337
protecting passwords, 337
requests, acceptable sources of, 336

V

Validating callers over the phone, 355
Visitor/contractor booking procedures, 70
VPN systems, 203
Vulnerability scanning, 366–367

W

Way Back Machine, 182
Weak password, 337
Weak procedures, 36–37
Weak security policies, 32–36
Weaponizing a call, 252–253
Weaponizing the scenario, 218
Wget, 177
Whois records, 186–187, 200–201
WiFi Pineapple, 308–309
Wireless dongles, 296–297
Wireless Pineapple, 252
WolVol BLACK, 273
Work experience placements, 218
Working at target company, 82
Writing the report, 68–69, 319–325
cover page, 319–320
disclaimer page, 320
executive summary, 323
individual attack vectors, 324–325
introduction, 323
social engineering methodology, 321–322
reconnaissance, 321
reporting, 322
scenario creation, 321–322
scenario execution, 322
threat modeling, 321
social engineering overview, 321
table of contents, 320–321
title page, 320

Z

Zed Attack Proxy (ZAP), 160–163
Zero-day exploitation, 15