    Fall back to not sorting large dirs in pscp -ls or psftp 'ls'. · 70fd577e
    Simon Tatham authored
    This mitigates a borderline-DoS in which a malicious SFTP server sends
    a ludicrously large number of file names in response to an SFTP
    opendir/readdir request sequence, causing the client to buffer them
    all and use up all the system's memory simply so that it can produce
    the output in sorted order.
    
    I call it a 'borderline' DoS because it's very likely that this is the
    same server that you'll also trust to actually send you the _contents_
    of some entire file or directory, in which case, if they want to DoS
    you, they can do that anyway at that point, and you have no way to tell
    a legit very large file from a bad one. So it's unclear to me that
    anyone would get any real advantage out of 'exploiting' this that they
    couldn't have got anyway by other means.
    
    That said, it may have practical benefits in the occasional case.
    Imagine a _legit_ gigantic directory (something like a maildir,
    perhaps, and perhaps stored on a server-side filesystem specialising
    in not choking on really huge single directories), together with a
    client workflow that involves listing the whole directory but then
    downloading only one particular file in it.
    
    For the moment, the threshold size is fixed at 8 MB of total data
    (counting the lengths of the file names as well as just the number of
    files). If that needs to become configurable later, we can always add
    an option.
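    
    Below is a minimal sketch of the fallback idea described above, not
    the actual PuTTY code; all names, the exact per-entry accounting, and
    the error handling are illustrative assumptions. The idea: buffer
    names and a running byte total, and once the total crosses the
    threshold, flush what is already buffered unsorted and stream every
    further name straight through.
    
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    
    #define SORT_THRESHOLD (8 * 1024 * 1024)   /* roughly 8 MB of buffered data */
    
    struct listing {
        char **names;
        size_t nnames, namesize;
        size_t total_bytes;    /* name lengths plus per-entry overhead */
        int sorting;           /* 1 while we still intend to sort */
    };
    
    static int cmp_name(const void *a, const void *b)
    {
        return strcmp(*(char *const *)a, *(char *const *)b);
    }
    
    /* Called once per name received from the server's readdir replies. */
    static void listing_add(struct listing *l, const char *name)
    {
        if (l->sorting) {
            l->total_bytes += strlen(name) + sizeof(char *);
            if (l->total_bytes > SORT_THRESHOLD) {
                /* Too much data: print what we have, unsorted, and stop buffering. */
                l->sorting = 0;
                for (size_t i = 0; i < l->nnames; i++) {
                    puts(l->names[i]);
                    free(l->names[i]);
                }
                l->nnames = 0;
            }
        }
    
        if (l->sorting) {
            if (l->nnames >= l->namesize) {      /* grow buffer (error checks omitted) */
                l->namesize = l->namesize ? l->namesize * 2 : 64;
                l->names = realloc(l->names, l->namesize * sizeof(char *));
            }
            l->names[l->nnames++] = strdup(name);
        } else {
            puts(name);        /* stream straight through, no sorting */
        }
    }
    
    /* Called when the whole directory has been read. */
    static void listing_finish(struct listing *l)
    {
        if (l->sorting) {
            qsort(l->names, l->nnames, sizeof(char *), cmp_name);
            for (size_t i = 0; i < l->nnames; i++) {
                puts(l->names[i]);
                free(l->names[i]);
            }
        }
        free(l->names);
    }
    
    A caller would start with struct listing l = {0}; l.sorting = 1;, feed
    each name from the server to listing_add(), and call listing_finish()
    once the directory has been read in full. Small directories still come
    out sorted; oversized ones cost only streaming output, never unbounded
    memory.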