Finding recent files and copying them into another folder


Jorge Reyes

Hi everyone,

I have an issue that is killing me. This is the scenario:

If I run ls -l I get this:
-rw-r--r-- 1 netrac netrac 2928 jul 8 18:47 file1.pl
-rw-r--r-- 1 netrac netrac 2928 jul 8 19:07 file2.pl
-rw-r--r-- 1 netrac netrac 2928 jul 8 19:10 file3.pl

Now what I need is only the files that were accessed at or after 19:00, i.e.:
-rw-r--r-- 1 netrac netrac 2928 jul 8 19:07 file2.pl
-rw-r--r-- 1 netrac netrac 2928 jul 8 19:10 file3.pl

I found this example, which creates a rule to find only files (not
directories or other things) that have been accessed at least 14 days
ago:

use File::Find::Rule;

my $path = "/data";

# ctime() compares against the raw stat ctime value (epoch seconds)
my @results = File::Find::Rule->file->ctime('>14')->in($path);
foreach my $i (@results) {
    print "$i\n";
}

The question is: instead of 14 days ago, how do I test against a
specific time, i.e. >= the current system date?

Thanks in advance.
 

C.DeRykus


[untested]

use File::Find::Rule;
use Time::Local;

my $path = "/data";    # directory to search, as in your example

# Epoch seconds for midnight on July 19, 2009 (month is 0-based).
my $target = timelocal( 0, 0, 0, 19, 6, 2009 );

my @results = File::Find::Rule
                  ->file
                  ->exec( sub { (stat($_))[10] >= $target } )   # ctime at or after $target
                  ->in($path);
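
If by ">= current system date" you mean a cutoff such as 19:00 on the
current day, $target could also be built from localtime (again
untested; the 19 below is the hour from your example):

use Time::Local;

# (localtime)[3, 4, 5] is (mday, mon, year), the fields timelocal expects.
my ( $mday, $mon, $year ) = ( localtime )[ 3, 4, 5 ];
my $target = timelocal( 0, 0, 19, $mday, $mon, $year );   # 19:00:00 today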
 

Jorge Reyes

Hi everyone,

Thanks for your response, C.DeRykus, but I have a lot of questions.

First of all, the goal of this script is to parse the files stored in a
specific folder and then copy them into another folder. These files are
constantly arriving and the script should detect them, so my logic is
(a rough sketch follows the list):

1. The first time the script runs, it takes the files from that moment.
2. Parse the files one by one.
3. At the end of each parse, copy the file to another folder.
4. Save the max ctime of all the parsed files.
5. Sleep 10 minutes, then wake up and look for new files (taking the
saved datetime as the starting point).
6. If there are new files, repeat the process; otherwise sleep another
10 minutes.
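
Roughly, I imagine the loop looking something like this (untested;
parse_file() and the folder names are just placeholders):

use strict;
use warnings;
use File::Basename qw(basename);
use File::Copy qw(copy);
use File::Find::Rule;

my $src  = "/data/incoming";     # hypothetical source folder
my $dest = "/data/processed";    # hypothetical destination folder
my $last = time();               # first run: start from "now"

while (1) {
    my @new = File::Find::Rule
                  ->file
                  ->exec( sub { (stat($_))[10] > $last } )   # newer than last pass
                  ->in($src);

    for my $file (@new) {
        # parse_file($file);     # placeholder for the real parsing
        copy( $file, "$dest/" . basename($file) )
            or warn "copy failed for $file: $!";
        my $ctime = (stat($file))[10];
        $last = $ctime if $ctime > $last;   # remember the max ctime
    }

    sleep 600;                   # wait 10 minutes before the next pass
}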

Am I overcomplicating this? Please give me some ideas :(

Thank you so much, all of you!!
 

Jim Gibson

How long does the copy exist? If you parse the file, then copy it, you
can use the presence of the copy to indicate that the file has been
parsed. Otherwise, you may encounter some race condition in which you
miss files because they appear after your check, but with a ctime
earlier than your max ctime of the files you are processing.

You could also move the original file to some other directory after it
has been successfully processed, if that does not interfere with some
other function.
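
For example (untested, and the folder names are just for illustration),
the presence check could be as simple as:

use File::Basename qw(basename);

my $copy_dir = "/data/processed";    # wherever the copies go

for my $file (@files) {              # @files: whatever the search returned
    next if -e "$copy_dir/" . basename($file);   # copy exists: already parsed
    # ... parse $file, then copy it into $copy_dir ...
}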
 

Jorge Reyes

Hi, thank you Jim.

Well, in fact that could be an alternative. I mean, the idea would be:

1. Run the script and copy all the files whose creation time falls
within the last 10 minutes.
2. Change dir to that "work folder" and parse the files one by one (at
this step, it would be very useful to sort the files by ctime).
3. If the files were sorted, then the last parsed file contains the
max ctime, so save it into $lastparsetime.
4. Sleep 10 minutes.
5. Wake up and look for new files (taking $lastparsetime as the
starting point); if any exist, repeat the process, else sleep another
10 minutes.

This is what I have, but it is not doing the above. Any help, please?

use strict;
use warnings;
use File::Find::Rule;
use Time::Local;

my $DIR     = "/home/netrac/BSMPERL";
my $DIRW    = "/raw_data/COMPLETED/BSM";
my $DIRBIN  = "$DIR/bin";
my $DIRLOG  = "$DIR/log";
my $DIRDAT  = "$DIR/dat";
my $LOGFILE = "$DIRLOG/BSMLog.log";

# Epoch seconds for midnight on August 8, 2009 (month is 0-based).
my $target = timelocal( 0, 0, 0, 8, 7, 2009 );

my @results = File::Find::Rule
                  ->file
                  ->name('MCBTSSubsystem-*.gz')
                  ->exec( sub { (stat($_))[10] > $target } )   # ctime after $target
                  ->in($DIRW);

foreach my $i (@results) {
    print "$i\n";
}
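
What I think is missing after the search is something like this
(untested; parse_file() and the work folder are placeholders):

use File::Basename qw(basename);
use File::Copy qw(copy);

my $workdir = "$DIR/work";    # hypothetical "work folder"

# Sort the matches by ctime so the last file parsed carries the max ctime.
my @sorted = sort { (stat($a))[10] <=> (stat($b))[10] } @results;

my $lastparsetime;
for my $file (@sorted) {
    # parse_file($file);      # placeholder for the real parsing
    copy( $file, "$workdir/" . basename($file) )
        or warn "copy failed for $file: $!";
    $lastparsetime = (stat($file))[10];   # sorted ascending, so this ends as the max
}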
 
