Kevin D. Mitnick's The Art of Deception

Arseni Mourzenko
Founder and lead developer
March 26, 2018
Tags: short, security

Just finished reading Kevin D. Mitnick's The Art of Deception. The first part of the book contains the stories of social engineers in action. The second part is a set of recommendations.

The stories are fun to read; the recommendations are sometimes useful too, but there is a caveat. An overarching idea behind them is that in order to protect oneself from social engineers, one has to be very careful when releasing information to the public. Things such as the names of the servers, the contact information of the employees, or the organization chart should, according to Kevin, be kept within the company: anything reachable by a social engineer could simplify his con.

This is not true.

This is exactly like saying that proprietary software products are more secure than open source alternatives because an attacker cannot read the source code and spot the vulnerabilities.

I already explained that the gap between, on one side, the unsafe outside world full of scoundrels and, on the other side, the world within the company, where everyone has only one thing on his mind (the prosperity of the corporation), doesn't exist. The same is true in social engineering. If your security is based on the secrecy of a few phone numbers and server names, what do you do as soon as a disgruntled employee is fired? Do you rename all your servers and ask all your employees to change their first and family names?

So by acting as if the information were never disclosed outside the company, you're only increasing the risk of a social engineering con. It's simple: since employees believe the information is classified, a social engineer who acquired it from an insider or from a person who left the company automatically acquires a powerful means to make someone believe that he's an insider, which is exactly the thing one wants to avoid. Asking “Can you copy for me the following files from STNB-27-C, please?” will not be interpreted the same way depending on whether the name of the server is known to be public or believed to be classified. (By the way, STNB-27-C is a terrible name for a server. Don't do that.)

This means one and only one thing: if you can't ensure the confidentiality of a piece of data, either make it public, or at least don't advertise it as classified. Maybe, for lack of better approaches, this was a good thing to do in 2002, when the book was written; I don't know. In any case, it is no longer a good approach today. Information security should rely on proven procedures: both technical procedures, which include cryptography, and procedures for humans to follow.
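To make that concrete, here is a minimal sketch in Python of what relying on cryptography rather than on the secrecy of names can look like: a request is trusted because its HMAC signature verifies against a shared secret, not because the caller happens to know the right server name. The environment variable, function names and payload are illustrative assumptions, not something taken from the book.

import hashlib
import hmac
import os

# Shared secret distributed through a proper channel; its confidentiality is what
# the scheme relies on, not the obscurity of host names or org charts.
SECRET_KEY = os.environ.get("REQUEST_SIGNING_KEY", "change-me").encode()

def sign_request(body: bytes) -> str:
    # The caller computes an HMAC-SHA256 tag over the request body.
    return hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()

def is_authentic(body: bytes, signature: str) -> bool:
    # The server recomputes the tag and compares it in constant time.
    return hmac.compare_digest(sign_request(body), signature)

if __name__ == "__main__":
    payload = b"copy the monthly reports to the audit share"
    token = sign_request(payload)
    print(is_authentic(payload, token))     # True: the signature checks out
    print(is_authentic(payload, "forged"))  # False: name-dropping STNB-27-C doesn't help

A request that fails the check is simply rejected, no matter how convincing the caller sounds on the phone.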

Ensuring that each machine and each application is hostile to everything from the outside, as I suggested in the article mentioned above, is therefore a wise approach. It doesn't matter for a specific server whether the person who tries to access it is a guy from Somalia or just an application written by the same team and hosted on a neighboring server; both are treated the same, because both are public.
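As an illustration of that stance, here is a small hypothetical HTTP service written with Python's standard library: every request, whether it comes from localhost, from a sibling application, or from the other side of the planet, goes through exactly the same token check. The token, port and header are assumptions made for the example.

from http.server import BaseHTTPRequestHandler, HTTPServer
import hmac
import os

API_TOKEN = os.environ.get("API_TOKEN", "change-me")

class HostileHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        supplied = self.headers.get("Authorization", "")
        # No allowance for "trusted" source addresses: local and remote callers
        # are subjected to the same verification.
        if not hmac.compare_digest(supplied, "Bearer " + API_TOKEN):
            self.send_response(401)
            self.end_headers()
            return
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HostileHandler).serve_forever()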

Aside from this little point where I disagree with Kevin, the book itself is great and interesting, full of concrete examples of things which could go wrong because of the human factor. Make sure you read it.