Message-ID: <200307301652.57294.Ken__48127.8931176805$1059667808@KRvW.com>
Date: Wed, 30 Jul 2003 16:52:57 -0400
From: "Kenneth R. van Wyk" <Ken@...W.com>
To: bugtraq@...urityfocus.com, NTBugtraq@...tserv.ntbugtraq.com
Subject: Vulnerability analysis site


 For those interested, my co-author (Mark Graff) and I have been posting and
 maintaining a free repository of analyses of some recent/topical
 vulnerabilities on our book's web page, at http://www.securecoding.org --
 you can alternatively go directly to the analyses at
 http://www.securecoding.org/companion/analysis/.
 
 In these periodic "columns", we analyze the root causes (no pun intended) of
 some recent vulnerabilities and discuss ways of preventing similar mistakes
 in the future.  We also look at where in the development process (e.g.,
 architecture, design, implementation, operations) the flaws were likely to
 have been introduced.  I've attached our most recent write-up below as an
 example.
 
 Although the book isn't free, we're maintaining the web site* as a free
 "book companion" resource.  No registration, subscription, purchase, 
 etc., is necessary.
 
 * The web site is graciously hosted by our friends at http://www.nidhog.com.
 
 Cheers,
 
 Ken van Wyk
 Mark Graff
 Co-authors of "Secure Coding: Principles and Practices" (O'Reilly, 2003)
 
 =====
 
 Copyright (C) 2003, Mark G. Graff and Kenneth R. van Wyk. Permission granted
 to reproduce and distribute in entirety with credit to authors.
 
 29 July 2003
 
 We figure our readers must know as much about new gadgets as any group
 in the world. So we are going to ask you to keep an eye out for one
 (we'll describe it later) that we think might prevent vulnerabilities
 like the one under the microscope today.
 
 The flaw was first reported by security firm ThreeZee. The full text
 of their advisory is available at
 http://www.threezee.com/sections/security/tzt001.txt.  (As always, we
 encourage you to read the original advisory in full. There's always
 more to the story than we cover here.)
 
 ThreeZee points out a problem with a particular mobile phone service
 provider's messaging software.  It turns out that any visitor to the
 provider's web site can predict the ID numbers associated with text
 messages, also known as Short Message Service (SMS) messages. That
 simple ability opens up a gaping security hole. You could (but please
 don't) obtain for yourself the delivery reports intended for message
 senders. You could get the email addresses of the recipients, too.
 
 What's the big deal about that?  Well, by combining that information
 with a couple of other flaws, an attacker could eavesdrop on new text
 messages sent to the cell phones. Potentially, one could gain at least
 partial control of a cell phone account.  It's a classic example of
 step-by-step system compromise, where each new plateau reached yields
 information making further compromise possible. If we were the
 each-new-dawn-a-miracle type, we would call it beautiful.
 
 But the bug itself is quite a curiosity, too. The principal flaw:
 message IDs are coined in a predictable sequence. Once you know one,
 you can deduce a practically unlimited number of them; and knowing
 those message IDs unlocks all that information you're not supposed to
 be able to get to. Here's how the advisory explains the prediction
 method.
 
         "While the Tracking, or message ID may look foreign in ways,
         it's quite simple.
  
         Think of the way an odometer turns on a car. That is the basic
         idea of the ID.
  
         Example 1: MsgID4_A54GKVHD 
  
         Example 2: MsgID4_3M5GKVHD
 
         Starting after the '_', the message ID will progress in the
        order of A-Z, and 0-9. There seems to be no association
         with the time sent, or who it was sent to. Like the odometer,
         when a character/digit of the ID reaches the end (9), it will
         restart at A, and the preceding character will increase by 1."
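 
 To make the scheme concrete, here is a minimal sketch, in Python, of
 an odometer-style generator as we read the advisory: an
 eight-character counter over the alphabet A-Z followed by 0-9. The
 "MsgID4_" prefix and the eight-character length come straight from
 the advisory's examples; the parsing details are our own guesses.
 
        # Sketch of the odometer scheme the advisory describes.
        # Illustrative only; the provider's actual code is unknown to us.
        ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"  # A-Z, then 0-9

        def next_id(msg_id):
            """Return the ID that follows msg_id, odometer-style."""
            prefix, _, counter = msg_id.rpartition("_")
            digits = [ALPHABET.index(c) for c in counter]
            i = len(digits) - 1
            while i >= 0:
                digits[i] = (digits[i] + 1) % len(ALPHABET)
                if digits[i] != 0:  # this column didn't wrap past '9',
                    break           # so there is no carry to propagate
                i -= 1              # carry into the preceding column
            return prefix + "_" + "".join(ALPHABET[d] for d in digits)

        print(next_id("MsgID4_A54GKVHD"))  # -> MsgID4_A54GKVHE
 
 Given any one valid ID, a loop around next_id() enumerates its
 successors; that is all the "prediction" amounts to.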
 
 Does this seem familiar? Where have we seen this before?
 
 Well, for starters, Robert Tappan Morris described a similar
 vulnerability in his 1985 paper [1] at AT&T Bell Labs. The problem he
 unearthed there had to do with predicting sequence numbers used in the
 TCP protocol. In Chapter 4 of Secure Coding, we cite a conceptually
 similar problem [2] in version 4 of MIT's Kerberos system. In that
 case, the designers really tried to make the session keys "random"
 (hence, unpredictable) but still came up short.
 
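 To illustrate the kind of trap the Kerberos 4 designers fell into
 (our sketch below, in Python, is illustrative only and is not the
 actual Kerberos code), consider seeding a non-cryptographic generator
 with guessable inputs: anyone who can bound those inputs can replay
 the seeding and reconstruct the "random" output.
 
        # Anti-pattern sketch, not the real Kerberos v4 code: a
        # statistical PRNG seeded from guessable values.
        import os
        import random
        import time

        def weak_session_key():
            random.seed(int(time.time()) ^ os.getpid())  # both guessable
            return random.getrandbits(64)

        # An attacker who can bound the timestamp and process ID simply
        # reruns the same seeding and enumerates candidate keys.
 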
 Now those design errors were made in the '70s and '80s, the bad old
 days. How could such a problem get introduced into a web site "in this
 day and age"?  Well, it's easy, really. Let's ask instead: how could
 it have been avoided?
 
 
 WHAT CAN GO WRONG?
 
 Some of the better software engineering curricula we are familiar
 with teach the value and power of a process known as "domain
 analysis." It's basically a fancy way to learn from the mistakes of
 others.  The point is to locate, study, and analyze, during the
 design process, similar problems that cropped up in earlier
 projects. We
 recommend the practice. (In fact, as we write this, one of us is
 undertaking just such a study for an important design we are
 participating in.)
 
 There's not always time for domain analysis. Even if there were, our
 profession is so new--and the world we deal with still so
 abstract--that we don't have the great body of disasters other
 engineers do to draw upon for inspiration. (To get a start, you might
 try Perrow [3], Neumann [4], and Reason [5] for stories, respectively,
 of catastrophic engineering errors; computer-related disasters; and
 common human errors in risk evaluation. In addition to Peter Neumann's
 above book, his superb on-line RISKS Digest,
 http://catless.ncl.ac.uk/Risks/, is a great forum to study and learn.) 
 So what is the best way for a programmer operating under real-world
 constraints to identify lurking design-level errors?
 
 Train yourself to ask the question, "What can go wrong?"  When
 designing a piece of software, the design team should be considering
 the ramifications of their design choices from exactly that
 perspective.  What would happen if someone were able to guess the
 value of any arbitrary message ID on the text-messaging portal?  What
 could an attacker do with that information?  It's our experience that
 once you start down that road, you'll often find yourself rooting out
 one potential design weakness after another. In this case, maybe it
 would have sufficed to pass on to the implementer a note that message
 IDs should be reasonably unpredictable.
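 
 The corresponding fix is short. Here is a minimal sketch, again in
 Python, that draws every character from the operating system's
 cryptographic random source via os.urandom(); the "MsgID4_" format
 mirrors the advisory's examples, and the function name is ours.
 
        # Sketch of an unpredictable replacement for the counter scheme.
        import os

        ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"

        def unpredictable_id(length=8):
            # Each character comes from the OS's cryptographic random
            # source, so knowing one ID reveals nothing about the next.
            return "MsgID4_" + "".join(
                ALPHABET[b % len(ALPHABET)] for b in os.urandom(length))
 
 (Two small caveats: the modulo introduces a slight bias, since 256 is
 not a multiple of 36, and with only 36^8 possible values the server
 should still check new IDs for collisions. Neither undercuts the
 unpredictability argument.)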
 
 
 A MIRACLE OF A RARE DEVICE
 
 We opened this analysis by mentioning a device we have in mind, one
 that could help prevent vulnerabilities like this.
 
 We would like this gizmo to hover above our shoulder all of the
 time. (Or you could build it into, let's say, the kind of pith helmet
 worn by jungle explorers. That would be OK.) Our main requirement is
 that it must sound a loud gong whenever we need to ask the question,
 "What can go wrong?" Once a project should be enough. Hey, by the way,
 solar power would be a neat add-on feature.
 
 Until we have one, we'll try to remember to ask the question
 ourselves--or, better yet, use a checklist to help remind us to ask
 the questions.
 
 Cheers,
 
 Mark G. Graff
 Kenneth R. van Wyk
 29 July 2003
 
 
 REFERENCES
 
 [1] Morris, Robert T. "A Weakness in the 4.2BSD Unix TCP/IP
 Software." AT&T Bell Laboratories, 1985.
 http://www.pdos.lcs.mit.edu/~rtm/papers/117.pdf
 [2] Dole, Bryn, Steve Lodin, and Eugene Spafford. "Misplaced Trust:
 Kerberos 4 Session Keys." Proceedings of the 1997 ISOC Symposium on
 Network and Distributed System Security (NDSS), 1997.
 http://www.isoc.org/isoc/conferences/ndss/97/dole_sl.pdf
 [3] Perrow, Charles. Normal Accidents. Princeton, NJ: Princeton
 University Press, 1999. ISBN 0-691-00412-9.
 [4] Neumann, Peter. Computer-Related Risks. New York:
 Addison-Wesley/ACM Press, 1995. ISBN 0-201-55805-X.
 [5] Reason, James. Human Error. New York: Cambridge University Press,
 1990. ISBN 0-521-31419-4.

-- 
KRvW Associates, LLC
IT Security Consulting and Training
http://www.KRvW.com
+1 703 981 7746


