Finding recent files and copying them into another folder

Discussion in 'Perl Misc' started by Jorge Reyes, Jul 9, 2009.

  1. Jorge Reyes

    Jorge Reyes Guest

    Hi everyone,

    I have this issue which is killing me. This is the scenario:

    If I run ls -l I get this:
    -rw-r--r-- 1 netrac netrac 2928 jul 8 18:47 file1.pl
    -rw-r--r-- 1 netrac netrac 2928 jul 8 19:07 file2.pl
    -rw-r--r-- 1 netrac netrac 2928 jul 8 19:10 file3.pl

    Now what I need are only the files that were accessed >= 19:00, i.e.
    -rw-r--r-- 1 netrac netrac 2928 jul 8 19:07 file2.pl
    -rw-r--r-- 1 netrac netrac 2928 jul 8 19:10 file3.pl

    I found this example, which creates a rule to find only files (and not
    directories or other things) that have been accessed at least 14 days
    ago:

    use File::Find::Rule;
    my $path = "/data";
    my @results = File::Find::Rule->file->ctime('>14')->in($path);
    foreach my $i (@results) {
        print "$i\n";
    }

    Question is, instead of 14 days ago, how do I compare against a
    specific time, i.e. >= the current system date?

    Thanks in advance
     
    Jorge Reyes, Jul 9, 2009
    #1

  2. C.DeRykus

    C.DeRykus Guest

    On Jul 8, 5:14 pm, Jorge Reyes <> wrote:
    > Hi everyone,
    >
    > I have this issue which is killing me. This is the scenario:
    >
    > If I run ls -l I get this:
    > -rw-r--r--   1 netrac   netrac      2928 jul   8 18:47 file1.pl
    > -rw-r--r--   1 netrac   netrac      2928 jul   8 19:07 file2.pl
    > -rw-r--r--   1 netrac   netrac      2928 jul   8 19:10 file3.pl
    >
    > Now what I need are only the files that were accessed >= 19:00, i.e.
    > -rw-r--r--   1 netrac   netrac      2928 jul   8 19:07 file2.pl
    > -rw-r--r--   1 netrac   netrac      2928 jul   8 19:10 file3.pl
    >
    > I found this example, which creates a rule to find only files (and not
    > directories or other things) that have been accessed at least 14 days
    > ago:
    >
    > use File::Find::Rule;
    > my $path = "/data";
    > my @results = File::Find::Rule->file->ctime('>14')->in($path);
    > foreach my $i (@results) {
    >     print "$i\n";
    > }
    >
    > Question is, instead of 14 days ago, how do I compare against a
    > specific time, i.e. >= the current system date?



    [untested]

    use File::Find::Rule;
    use Time::Local;

    my $path = "/data";

    # 19:00 on July 8, 2009 -- timelocal's month is 0-based, so 6 = July
    my $target = timelocal( 0, 0, 19, 8, 6, 2009 );

    my @results = File::Find::Rule
        ->file
        ->exec( sub {
            my ( $shortname, $dir, $fullname ) = @_;
            # stat field 10 is ctime; keep files changed at or after $target
            (stat($fullname))[10] >= $target;
        } )
        ->in($path);
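    If memory serves, File::Find::Rule's numeric tests such as ctime compare
    the raw stat value (seconds since the epoch) rather than a number of
    days, so the same cutoff can also be written without exec. Another
    untested sketch along those lines:

    use File::Find::Rule;
    use Time::Local;

    my $path   = "/data";
    my $target = timelocal( 0, 0, 19, 8, 6, 2009 );   # 19:00 on July 8, 2009

    # ctime('>=N') keeps files whose raw ctime (epoch seconds) is >= N
    my @results = File::Find::Rule
        ->file
        ->ctime(">=$target")
        ->in($path);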

    --
    Charles DeRykus
     
    C.DeRykus, Jul 9, 2009
    #2

  3. Jorge Reyes

    Jorge Reyes Guest

    On 9 jul, 00:16, "C.DeRykus" <> wrote:
    > On Jul 8, 5:14 pm, Jorge Reyes <> wrote:
    > > [...]
    > > Question is, instead of 14 days ago, how do I compare against a
    > > specific time, i.e. >= the current system date?
    >
    > [untested]
    >
    > use File::Find::Rule;
    > use Time::Local;
    >
    > my $path = "/data";
    >
    > # 19:00 on July 8, 2009 -- timelocal's month is 0-based, so 6 = July
    > my $target = timelocal( 0, 0, 19, 8, 6, 2009 );
    >
    > my @results = File::Find::Rule
    >     ->file
    >     ->exec( sub {
    >         my ( $shortname, $dir, $fullname ) = @_;
    >         # stat field 10 is ctime; keep files changed at or after $target
    >         (stat($fullname))[10] >= $target;
    >     } )
    >     ->in($path);
    >
    > --
    > Charles DeRykus


    Hi everyone,

    Thanks for your response, C.DeRykus, but I have a lot of questions.

    First of all, the goal of this script is to parse the files stored in a
    specific folder and then copy them into another folder. These files are
    constantly arriving and the script should detect them, so my logic is
    (see the sketch after this list):

    1. The first time the script runs, it takes the files from this moment on.
    2. Parse the files one by one.
    3. At the end of each parse, copy the file to another folder.
    4. Save the max ctime of all the parsed files.
    5. Sleep 10 minutes, then wake up and look for new files (taking the
       saved datetime as the starting point).
    6. If there are new files, repeat the process; otherwise sleep another
       10 minutes.
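    A rough sketch of that loop, just to make the idea concrete (untested;
    the directories and the parse_file() routine are only placeholders, not
    real paths from my setup):

    use strict;
    use warnings;
    use File::Basename qw(basename);
    use File::Copy qw(copy);
    use File::Find::Rule;

    my $src       = "/data/incoming";   # placeholder: folder where files appear
    my $dest      = "/data/work";       # placeholder: folder parsed files go to
    my $last_seen = time();             # first run: only pick up files from now on

    sub parse_file {
        my ($file) = @_;
        # placeholder: the real parsing of one file goes here
    }

    while (1) {
        # files whose ctime is newer than the newest one already processed
        my @new = File::Find::Rule
            ->file
            ->ctime(">$last_seen")
            ->in($src);

        for my $file (@new) {
            parse_file($file);
            copy( $file, "$dest/" . basename($file) )
                or warn "copy $file failed: $!";
            my $ctime = (stat $file)[10];
            $last_seen = $ctime if $ctime > $last_seen;
        }

        sleep 600;                      # wait 10 minutes before checking again
    }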

    Am I overcomplicating this? Please give me some ideas :(

    Thank you so much all of you!!
     
    Jorge Reyes, Jul 9, 2009
    #3
  4. Jim Gibson

    Jim Gibson Guest

    In article
    <>,
    Jorge Reyes <> wrote:


    > Hi everyone,
    >
    > Thanks for your response C.DeRykus but i have a lot of questions,
    >
    > first of all the target of this script is parse the files stored in a
    > specific folder and then copy into another folder, these files are
    > constantly emerging and the script should detect them, so, my logic
    > is:
    >
    > First time i run the script, it takes the files from this moment
    > Parse the files one by one
    > At the end of each parse, copy the file to another folder
    > Save the max ctime of all the parsed files
    > Sleep 10 minutes and then wake up and look up for new files (taking
    > the saved datetime as started point)
    > If there are new files then repit the proccess, else Sleep another 10
    > minutes.
    >
    > Am I greatly complicated? please give me some ideas :(


    How long does the copy exist? If you parse the file, then copy it, you
    can use the presence of the copy to indicate that the file has been
    parsed. Otherwise, you may encounter a race condition in which you
    miss files because they appear after your check, but with a ctime
    earlier than the max ctime of the files you are processing.

    You could also move the original file to some other directory after it
    has been successfully processed, if that does not interfere with some
    other function.
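    A minimal sketch of the move-it-when-done idea (the "processed"
    directory and the helper name are only placeholders):

    use strict;
    use warnings;
    use File::Basename qw(basename);
    use File::Copy qw(move);

    my $done_dir = "/data/processed";   # placeholder for the "already handled" folder

    # after a file has been parsed successfully, move it out of the watched folder
    sub mark_processed {
        my ($file) = @_;
        move( $file, "$done_dir/" . basename($file) )
            or warn "could not move $file: $!";
    }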

    --
    Jim Gibson
     
    Jim Gibson, Jul 9, 2009
    #4
  5. Jorge Reyes

    Jorge Reyes Guest

    On 9 jul, 15:16, Jim Gibson <> wrote:
    > In article <>, Jorge Reyes <> wrote:
    > > [...]
    > > Am I overcomplicating this? Please give me some ideas :(
    >
    > How long does the copy exist? If you parse the file, then copy it, you
    > can use the presence of the copy to indicate that the file has been
    > parsed. Otherwise, you may encounter a race condition in which you
    > miss files because they appear after your check, but with a ctime
    > earlier than the max ctime of the files you are processing.
    >
    > You could also move the original file to some other directory after it
    > has been successfully processed, if that does not interfere with some
    > other function.
    >
    > --
    > Jim Gibson


    Hi, thank you Jim,

    Well, in fact that could be an alternative. I mean, the idea would be:

    1. Run the script and copy all the files whose ctime falls within the
       last 10 minutes.
    2. Change dir to that "work folder" and parse the files one by one (at
       this step it would be very useful to sort the files by ctime).
    3. If the files were sorted, then the last parsed file contains the
       max ctime, so save it into $lastparsetime.
    4. Sleep 10 minutes.
    5. Wake up and look for new files (taking $lastparsetime as the
       starting point); if any exist, repeat the process, otherwise sleep
       another 10 minutes.

    This is what I have, but it is not doing the above. Any help, please?

    use strict;
    use warnings;
    use File::Find::Rule;
    use Time::Local;

    my $DIR     = "/home/netrac/BSMPERL";
    my $DIRW    = "/raw_data/COMPLETED/BSM";
    my $DIRBIN  = "$DIR/bin";
    my $DIRLOG  = "$DIR/log";
    my $DIRDAT  = "$DIR/dat";
    my $LOGFILE = "$DIRLOG/BSMLog.log";

    # Midnight on July 8, 2009 -- timelocal's month is 0-based, so July is 6
    # (0, 0, 0, 8, 7, 2009 would be August 8).
    my $target = timelocal( 0, 0, 0, 8, 6, 2009 );

    my @results = File::Find::Rule
        ->file
        ->name('MCBTSSubsystem-*.gz')
        ->exec( sub {
            my ( $shortname, $dir, $fullname ) = @_;
            # stat field 10 is ctime; keep files changed after $target
            (stat($fullname))[10] > $target;
        } )
        ->in($DIRW);

    foreach my $i (@results) {
        print "$i\n";
    }
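    Steps 2 and 3 (sorting by ctime and remembering the newest timestamp)
    are not in the code yet; a minimal, untested continuation, with the
    parse/copy part left as a comment, might be:

    # sort the matched files oldest-to-newest by ctime
    my @sorted = sort { (stat $a)[10] <=> (stat $b)[10] } @results;

    my $lastparsetime = $target;
    foreach my $file (@sorted) {
        # ... parse $file and copy it into the work folder here ...
        my $ctime = (stat $file)[10];
        $lastparsetime = $ctime if $ctime > $lastparsetime;
    }

    # the next pass would use ->ctime(">$lastparsetime") instead of the fixed $target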
     
    Jorge Reyes, Jul 9, 2009
    #5
