File::Find::find can fail to chdir out of a long length named directory #9001

Open
p5pRT opened this issue Aug 21, 2007 · 4 comments
Comments

p5pRT commented Aug 21, 2007

Migrated from rt.perl.org#44819 (status was 'open')

Searchable as RT44819$


p5pRT commented Aug 15, 2007

From adavies@ptc.com

Howdy All.

On Windows XP I've seen File::Find::find fail with:

  Can't cd to /dir/xxx/####../../..

where the #'s are a long filename (~256 characters).

[Note also that the error message lacks a separating '/'
between the dir_name and the '../../..', and does not include $!.]

It looks like chdir() is hitting some kind of too-long-pathname issue.

Here's a testcase that shows the problem:

# %<
use File::Find;

chdir '/' or die $!;

my $root_dir = '/long_dir_test';

-d $root_dir || mkdir($root_dir) || die "can't mkdir $root_dir: $!\n";

# Make a maximally long named directory;
# the problem seems to be sensitive to filename length.
foreach my $dir_name ('d', 'di', 'dir') {
    chdir($root_dir) or die "can't chdir to $root_dir: $!\n";
    while (mkdir $dir_name) {
        chdir $dir_name or last;
    }
}

chdir('/') || die "can't chdir to /: $!\n";

File::Find::find(sub {}, $root_dir);  # this should die
# >%

By changing the code in File::Find to perform multiple individual
chdir("..") calls, the problem no longer appears.
The following minimal patch shows what I mean:

  while ( defined ($SE = pop @Stack) ) {
      ($Level, $p_dir, $dir_rel, $nlink) = @$SE;
      if ($CdLvl > $Level && !$no_chdir) {
+         if ($^O eq 'MSWin32') {
+             my $depth = $CdLvl - $Level;
+             while ($depth--) {
+                 chdir("..") or die "Could not cd back to $dir_name: $!\n";
+             }
+         } else {
          my $tmp;
          if ($Is_MacOS) {
              $tmp = (':' x ($CdLvl-$Level)) . ':';
@@ -939,6 +945,7 @@
          }
          die "Can't cd to $dir_name" . $tmp
              unless chdir ($tmp);
+         }
          $CdLvl = $Level;
      }

I'm not sure how this affects performance - probably for the worse :-(
Perhaps a better fix would use the lengths of $cwd and $dir_name
to determine whether multiple chdir("..") calls are needed.

Cheers, alex.


p5pRT commented Aug 21, 2007

From adavies@ptc.com

Created by adavies@ptc.com

On Windows XP I've seen File::Find::find fail with:

  Can't cd to /dir/####../../..

where the #'s are a long filename (~256 characters).

[Note also that the error message lacks a separating '/'
between the dir_name and the '../../..', and does not include the
value of $!, which happens to be 'No such file or directory'.]

It looks like chdir("../../../..") is hitting some kind of
too-long-pathname issue.

Here's a testcase that shows the problem:

# %<
use File::Find;

chdir '/' or die $!;

my $root_dir = '/long_dir_test';

-d $root_dir || mkdir($root_dir) || die "can't mkdir $root_dir: $!\n";

# Make a maximally long named directory;
# the problem seems to be sensitive to filename length.
foreach my $dir_name ('d', 'di', 'dir') {
    chdir($root_dir) or die "can't chdir to $root_dir: $!\n";
    while (mkdir $dir_name) {
        chdir $dir_name or last;
    }
}

chdir('/') || die "can't chdir to /: $!\n";

File::Find::find(sub {}, $root_dir);  # this should die
# >%

By changing the code in File::Find to perform multiple individual
chdir("..") calls, the problem no longer appears.
The following minimal patch shows what I mean:

  while ( defined ($SE = pop @Stack) ) {
      ($Level, $p_dir, $dir_rel, $nlink) = @$SE;
      if ($CdLvl > $Level && !$no_chdir) {
+         if ($^O eq 'MSWin32') {
+             my $depth = $CdLvl - $Level;
+             while ($depth--) {
+                 chdir("..") or die "Could not cd back to $dir_name: $!\n";
+             }
+         } else {
          my $tmp;
          if ($Is_MacOS) {
              $tmp = (':' x ($CdLvl-$Level)) . ':';
@@ -939,6 +945,7 @@
          }
          die "Can't cd to $dir_name" . $tmp
              unless chdir ($tmp);
+         }
          $CdLvl = $Level;
      }

I'm not sure how this affects performance - probably for the worse :-(
It might be possible to use the value of C< length($cwd) + length($dir_name) >
to determine whether multiple chdir("..") calls are needed.
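That length check could be sketched roughly as follows. This is only a sketch: the 260-character MAX_PATH constant is Windows' conventional path-length limit, and the helper name needs_stepwise_chdir is illustrative, not part of File::Find:

```perl
use strict;
use warnings;

use constant WIN32_MAX_PATH => 260;   # conventional Win32 limit (assumption)

# Hypothetical helper: would a single relative chdir("../..", etc.) risk
# exceeding the path-length limit, so that File::Find should fall back to
# stepwise chdir("..") calls instead?
sub needs_stepwise_chdir {
    my ($cwd, $levels) = @_;
    # Build "../../.." for $levels levels
    my $rel = join('/', ('..') x $levels);
    # The OS may combine $cwd and $rel before collapsing the '..'
    # components, so the combined length is what matters.
    return length($cwd) + 1 + length($rel) > WIN32_MAX_PATH;
}
```

A patched File::Find could then pay the cost of stepwise chdir("..") calls only when this predicate is true, keeping the common case as fast as it is today.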

Cheers, alex.

Perl Info

Flags:
    category=library
    severity=low

Site configuration information for perl v5.8.7:

Configured by adavies at Thu Aug 11 14:02:10 2005.

Summary of my perl5 (revision 5 version 8 subversion 7) configuration:
  Platform:
    osname=MSWin32, osvers=5.1, archname=MSWin32-x86-multi-thread
    uname=''
    config_args='undef'
    hint=recommended, useposix=true, d_sigaction=undef
    usethreads=define use5005threads=undef useithreads=define
usemultiplicity=define
    useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
    use64bitint=undef use64bitall=undef uselongdouble=undef
    usemymalloc=n, bincompat5005=undef
  Compiler:
    cc='cl', ccflags ='-nologo -Gf -W3 -MD -DNDEBUG -O1 -DWIN32
-D_CONSOLE -DNO_STRICT -DHAVE_DES_FCRYPT  -DPERL_IMPLICIT_CONTEXT
-DPERL_IMPLICIT_SYS -DUSE_PERLIO -DPERL_MSVCRT_READFIX',
    optimize='-MD -DNDEBUG -O1',
    cppflags='-DWIN32'
    ccversion='12.00.8804', gccversion='', gccosandvers=''
    intsize=4, longsize=4, ptrsize=4, doublesize=8, byteorder=1234
    d_longlong=undef, longlongsize=8, d_longdbl=define, longdblsize=10
    ivtype='long', ivsize=4, nvtype='double', nvsize=8, Off_t='__int64',
lseeksize=8
    alignbytes=8, prototype=define
  Linker and Libraries:
    ld='link', ldflags ='-nologo -nodefaultlib -release
-libpath:"c:\perl3\lib\CORE"  -machine:x86'
    libpth=\lib
    libs=  oldnames.lib kernel32.lib user32.lib gdi32.lib winspool.lib
comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib
netapi32.lib uuid.lib ws2_32.lib mpr.lib winmm.lib  version.lib
odbc32.lib odbccp32.lib msvcrt.lib
    perllibs=  oldnames.lib kernel32.lib user32.lib gdi32.lib
winspool.lib  comdlg32.lib advapi32.lib shell32.lib ole32.lib
oleaut32.lib  netapi32.lib uuid.lib ws2_32.lib mpr.lib winmm.lib
version.lib odbc32.lib odbccp32.lib msvcrt.lib
    libc=msvcrt.lib, so=dll, useshrplib=yes, libperl=perl58.lib
    gnulibc_version='undef'
  Dynamic Linking:
    dlsrc=dl_win32.xs, dlext=dll, d_dlsymun=undef, ccdlflags=' '
    cccdlflags=' ', lddlflags='-dll -nologo -nodefaultlib -release
-libpath:"c:\perl3\lib\CORE"  -machine:x86'

Locally applied patches:
    


@INC for perl v5.8.7:
    C:/perl3/lib
    C:/perl3/site/lib
    .


Environment for perl v5.8.7:
    HOME=C:\alex
    LANG (unset)
    LANGUAGE (unset)
    LD_LIBRARY_PATH (unset)
    LOGDIR (unset)
 
    PATH=C:\WINNT\system32;C:\WINNT;C:\WINNT\System32\Wbem;C:\perl3\bin;D:\alex\bin;C:\cygwin\bin;C:\Program Files\Perforce;C:\Program Files\Microsoft Visual Studio\VC98\Bin;C:\Program Files\Microsoft Visual Studio\Common\MSDev98\Bin
    PERL_BADLANG (unset)
    SHELL (unset)


p5pRT commented Aug 28, 2007

From @rgs

On 21/08/07, Davies, Alex (via RT) <perlbug-followup@perl.org> wrote:

> On Windows XP I've seen File::Find::find fail with:
>
>   Can't cd to /dir/####../../..
>
> where the #'s are a long filename (~256 characters).
>
> [Note also that the error message lacks a separating '/'
> between the dir_name and the '../../..', and does not include the
> value of $!, which happens to be 'No such file or directory'.]
>
> It looks like chdir("../../../..") is hitting some kind of
> too-long-pathname issue.

It's probably better to fix this in chdir() itself than to put a
workaround in File::Find. Can this bug be reproduced without
File::Find?
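One way to exercise the same path-length arithmetic without File::Find would be a script along these lines. This is a sketch only: the helper name and the 200-level cap are illustrative, and on Unix the single multi-level chdir is expected to succeed; the failure, if any, would be seen on Windows:

```perl
use strict;
use warnings;
use File::Temp qw(tempdir);
use Cwd qw(getcwd);

# Build a deep directory tree by hand (no File::Find involved), descend
# to the bottom, then try to climb back out with one multi-level relative
# chdir, as File::Find does internally. Returns the depth reached, the
# length of the deepest cwd, and whether the single chdir succeeded.
sub try_single_chdir_out {
    my $root = tempdir(CLEANUP => 1);
    chdir $root or die "can't chdir to $root: $!\n";
    my $depth = 0;
    # Keep nesting until the OS refuses or we hit an arbitrary cap.
    while ($depth < 200 && mkdir('dir') && chdir('dir')) {
        $depth++;
    }
    my $cwd_len = length getcwd();
    my $ok = chdir(join('/', ('..') x 3)) ? 1 : 0;
    chdir '/' or die "can't chdir to /: $!\n";   # leave tree before cleanup
    return ($depth, $cwd_len, $ok);
}

my ($depth, $cwd_len, $ok) = try_single_chdir_out();
print "depth=$depth cwd_length=$cwd_len single_chdir_ok=$ok\n";
```

If single_chdir_ok comes back 0 on Windows once cwd_length approaches the path-length limit, that would confirm the problem lives in chdir() rather than in File::Find.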


p5pRT commented Aug 28, 2007

The RT System itself - Status changed from 'new' to 'open'
