How to count number of files available in a directory recursively in Python using rsync? |
I would use a different approach using fabric, which is a great tool for
executing remote commands.
from fabric.api import run, env
env.host_string = 'example.org'
output = run('find /tmp -type f | wc -l')
num_files = int(output)
Now you have the number of files in the variable num_files. I simply used the
find command to search for files recursively, beginning at the directory /tmp,
and counted the returned lines with wc -l.
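If you would rather do the counting on the Python side, here is a small
variation on the same idea, still assuming fabric 1.x and the hypothetical
host from the snippet above:
from fabric.api import run, env

env.host_string = 'example.org'
# list the files remotely, but count the lines locally instead of piping to wc -l
output = run('find /tmp -type f')
num_files = len(output.splitlines())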
|
what's the best data structure for pattern matching within a vocabulary? |
What you want is a suffix tree. This stores all suffixes of a (set of)
strings in a trie (in your case, the set of words). Each leaf of the trie
is associated with the set of strings that have that suffix.
When searching for a substring, you simply match the substring at the root
of the trie; your substring must be a prefix of some suffix, or there is no
match. Discovering the existence of a match takes time linear in the length
of the substring. To determine all matching words, you have to enumerate all
leaves of the trie reachable from the point where the match completes.
That is a tree-walk problem; if the tree has significant branching, it
might be a bit expensive.
You could precompute, for each trie node, the set of associated words; this
is likely to be pretty big, but now you have an
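A minimal sketch of the idea in Python (the class and method names are my
own, not from any library); it stores the precomputed word set at every node,
as suggested in the last paragraph, trading memory for lookups that only cost
one trie walk per query:
class SuffixTrieNode:
    def __init__(self):
        self.children = {}   # char -> SuffixTrieNode
        self.words = set()   # vocabulary words reachable through this node

class SuffixTrie:
    def __init__(self, vocabulary):
        self.root = SuffixTrieNode()
        for word in vocabulary:
            # insert every suffix of the word, recording the word at each node
            for start in range(len(word)):
                node = self.root
                for ch in word[start:]:
                    node = node.children.setdefault(ch, SuffixTrieNode())
                    node.words.add(word)

    def words_containing(self, substring):
        """Return all vocabulary words that contain `substring`."""
        node = self.root
        for ch in substring:
            node = node.children.get(ch)
            if node is None:
                return set()
        return node.words

trie = SuffixTrie(["banana", "bandana", "cabana"])
print(trie.words_containing("ana"))   # {'banana', 'bandana', 'cabana'}
print(trie.words_containing("band"))  # {'bandana'}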
|
keeping last 7 days zip files and deleting rest all files from a directory |
Do you really need to sort at all? If you need to delete files which are
more than 7 days old, get the last-modified date and subtract it from the
current time. If the difference is more than 7*24*60*60 seconds then you
can delete it.
All you need is a for loop after the f.listFiles() line. This is not exact
working code - use it to get to the working code.
long timeInEpoch = System.currentTimeMillis(); // slightly faster than new Date().getTime();
File f = new File("/tmp");
if (f.isDirectory()) {
    final File[] files = f.listFiles();
    for (int i = 0; i < files.length; i++) {
        // delete anything older than 7 days
        if (timeInEpoch - files[i].lastModified() > 1000L * 60 * 60 * 24 * 7) {
            files[i].delete();
        }
    }
}
|
deleting all zip files in target directory and only keeping latest two zip files |
Not an optimized one, but you can use this.
// listFiles and deleteQuietly come from org.apache.commons.io.FileUtils
String[] extensions = {"zip", "rar"};
Collection<File> fileList = listFiles(FilePath2, extensions, false);
File[] fileArray = fileList.toArray(new File[0]);

// bubble-sort by last-modified time, oldest first
File temp;
for (int i = 1; i < fileArray.length; i++) {
    for (int j = 0; j < fileArray.length - 1; j++) {
        if (fileArray[j].lastModified() > fileArray[j + 1].lastModified()) {
            temp = fileArray[j];
            fileArray[j] = fileArray[j + 1];
            fileArray[j + 1] = temp;
        }
    }
}

// delete everything except the two newest files
for (int i = 0; i < fileArray.length - 2; i++) {
    deleteQuietly(fileArray[i]);
}
|
Searching files by matching filename pattern and concatenating contents of the files |
For the second NP_len_*.fa pattern the regex can be like
.+NP_len_\d{1,3}\.fa
and for the first one, where you do not want the N, use this
.+?[^N]P_len_\d{1,3}\.fa
This one will match all the patterns except those with N before P. Regarding
your xaa part, I have assumed that folder names might grow in the future; you
can alternatively match for a string of length 3 as well.
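A quick way to sanity-check the two patterns, sketched in Python with a few
made-up filenames (the sample names are mine, only the regexes come from the
answer above):
import re

names = ["xaa/NP_len_123.fa", "xab/AP_len_7.fa", "xac/NNP_len_45.fa"]

with_n  = re.compile(r".+NP_len_\d{1,3}\.fa")      # requires N before P
without = re.compile(r".+?[^N]P_len_\d{1,3}\.fa")  # anything except N before P

print([n for n in names if with_n.fullmatch(n)])   # ['xaa/NP_len_123.fa', 'xac/NNP_len_45.fa']
print([n for n in names if without.fullmatch(n)])  # ['xab/AP_len_7.fa']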
|
keeping original Vector intact when modifying it's copy |
You need to do deep cloning. A simple way is to serialize the vector and
deserialize it again. You will get the expected result.
Note: objects in the vector should be serializable.
(or)
Create a new vector, then iterate over the existing vector, clone each
object, and add the clone to the new vector.
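The question is about Java, but the serialize/deserialize trick is easy to
illustrate with a quick Python sketch: a round trip through a byte stream
produces a deep copy, so mutating the copy leaves the original intact:
import pickle

original = [["a", "b"], ["c"]]
deep_copy = pickle.loads(pickle.dumps(original))  # serialize, then deserialize

deep_copy[0].append("X")
print(original)   # [['a', 'b'], ['c']] - untouched
print(deep_copy)  # [['a', 'b', 'X'], ['c']]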
|
rsync recursively and exclude content of specific directory, not the directory |
In recent versions of rsync, you can use the -F option and put a file
".rsync-filter" in the directory src, containing:
- dir1/***
That seemed to work for me. I'm assuming that your hierarchy above is all
under "src/".
|
Disable maximize button of WPF window, keeping resizing feature intact |
WPF does not have the native capability to disable the Maximize button
alone, as you can do with WinForms. You will need to resort to a WinAPI
call. It's not scary:
[DllImport("user32.dll")]
private static extern int GetWindowLong(IntPtr hWnd, int nIndex);

[DllImport("user32.dll")]
private static extern int SetWindowLong(IntPtr hWnd, int nIndex, int dwNewLong);

private const int GWL_STYLE = -16;
private const int WS_MAXIMIZEBOX = 0x10000;

private void Window_SourceInitialized(object sender, EventArgs e)
{
    var hwnd = new WindowInteropHelper((Window)sender).Handle;
    var value = GetWindowLong(hwnd, GWL_STYLE);
    SetWindowLong(hwnd, GWL_STYLE, (int)(value & ~WS_MAXIMIZEBOX));
}
|
How to select all files of a given filetype EXCEPT ones matching a name pattern? |
I think that -eq and -ne match the given string and don't support
wildcards.
Only -like supports wildcards for pattern matching.
You can however use a regular expression with the -notmatch switch to
achieve what you want. Since it's a regular expression now you need to use
.* instead of *. And the beginning is marked with ^.
So you end up with this
{$_.Name -notmatch "^reports.*|^category.*"}
The whole command
Get-Item -Path ($AppDir + "reports*.dbf") | Where-Object {$_.Name -notmatch
"^reports.*|^category.*"}
|
Deleting files from S3 based on pattern matching? |
Based on my solution, I am just posting the code so that it might be useful
for someone looking at the same scenario:
/**
 * Delete keys/objects from a bucket with a matching prefix
 * @param bucket Bucket in which the delete operation is performed
 * @param prefix String to match the pattern on keys.
 */
@Override
public void deleteFilesInS3(String bucket, String prefix) throws IOException {
    try {
        List<KeyVersion> keys = listAllKeysWithPrefix(bucket, prefix);
        DeleteObjectsRequest multiObjectDeleteRequest = new DeleteObjectsRequest(bucket);
        multiObjectDeleteRequest.setKeys(keys);
        s3EncryptionClient.deleteObjects(multiObjectDeleteRequest);
    } catch (MultiObjectDeleteException e) {
|
Probing Path - How to Force Calls to Remain Within a Directory Structure |
Further investigation revealed that this was not a reflection or assembly
loading problem. The problems I was experiencing were due to build issues
inside the assemblies I was reflecting on. The errors were obscure and made
the problem look like something it wasn't. Thanks to SWeko for suggesting I
look into the inner exceptions. That comment helped a lot!!
|
List files matching pattern when too many for bash globbing |
This is where find in combination with xargs will help.
find /path/to/files -name "pattern*" -print0 | xargs -0 ls
Note from comments: xargs will help if you wish to do something with the list
once you have obtained it from find. If you only intend to list the files,
then find should suffice. However, if you wish to copy, delete or perform any
other action on the list, then using xargs instead of -exec will help.
|
Maven : copy directory structure without the files |
You can use the maven-resources-plugin for such purposes, like this:
<project>
  ...
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-resources-plugin</artifactId>
        <version>2.6</version>
        <executions>
          <execution>
            <id>copy-resources</id>
            <!-- here the phase you need -->
            <phase>validate</phase>
            <goals>
              <goal>copy-resources</goal>
            </goals>
            <configuration>
              <outputDirectory>${basedir}/target/extra-resources</outputDirectory>
              <resources>
                <resource>
                  <directo
|
How to grep files under a pattern path |
From the UNIX philosophy:
Write programs that do one thing and do it well.
Write programs to work together.
I don't like the GNU extension for recursive directory searching. The tool
find does a much better job, has a cleaner syntax with more options and
doesn't break the philosophy!
$ find foo/*/VIEW -name "*.groovy" -exec grep method {} \;
|
Fastest call to count matching files in a Windows directory? |
If this is really an issue (with tens of thousands of filenames to search,
and thousands of different searches to be done), then the simplest option
(without writing huge amounts of code) might be to cache the filenames in an
indexed database table and then use SQL to do the count. After the work of
copying the filenames to the database and indexing the table, any decent
SQL engine should do an excellent job on each individual search - I would
use SQLite myself.
The snag will be keeping the cache up to date. How much of a pain that is
will depend on exactly what you're attempting to do.
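A rough sketch of the idea in Python with SQLite (the table layout, directory
path and search pattern below are my own inventions, just to show the shape
of it):
import os
import sqlite3

# build the cache: one row per filename
conn = sqlite3.connect("filecache.db")
conn.execute("CREATE TABLE IF NOT EXISTS files (name TEXT)")
conn.execute("CREATE INDEX IF NOT EXISTS idx_files_name ON files (name)")
conn.execute("DELETE FROM files")
conn.executemany("INSERT INTO files (name) VALUES (?)",
                 ((name,) for name in os.listdir(r"C:\big\directory")))
conn.commit()

# each individual search is now a single SQL query
(count,) = conn.execute("SELECT COUNT(*) FROM files WHERE name LIKE ?",
                        ("report%.txt",)).fetchone()
print(count)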
|
How to jar files in maven whose directory structure is not similar to sourceDirectory? |
Maven uses different folders for source and other files (so-called
resources). From src/main/java it compiles all .java files and packages the
resulting .class files into the jar. All other files in that directory are
ignored, to my knowledge.
Now if you want other files to be packaged too, put them in
src/main/resources instead.
EDIT
To manually specify a resource directory add
<resources>
  <resource>
    <directory>D:\Windchill_Market\ERP Connector\SourceCode\Build\src1\src\main\resource</directory>
  </resource>
</resources>
to your <build> section
|
change inheritance to composition for existing jaxb class structure keeping the current xml structure unchanged |
Try this:
@XmlAccessorType(XmlAccessType.NONE)
public class B {
    @XmlElement
    private String aString; // you don't need to create class A (aString is an element just like bString)
    @XmlElement
    private String bString;
}
|
Rsync and ssh on android: No such file or directory |
I'm not sure, but maybe the problem is that the destination path
(rajeesh@10.0.2.2:backup/) is not absolute?
Also, if you want to sync your files on the same device, maybe you should
try not using ssh, and do something like this:
rsync -rvz /mnt/sdcard/some_directory /backup
|
How to configure grunt-contrib-uglify to minify files while retaining directory structure |
Set the flatten property to false.
There is a clear explanation in the grunt-contrib-copy GitHub README:
https://github.com/gruntjs/grunt-contrib-copy
Excerpt:
$ grunt copy
Running "copy:main" (copy) task
Created 1 directories, copied 1 files
Done, without errors.
$ tree -I node_modules
.
├── Gruntfile.js
├── dest
│   └── src
│       ├── a
│       └── subdir
└── src
    ├── a
    └── subdir
        └── b
5 directories, 4 files
Flattening the filepath output:
copy: {
  main: {
    expand: true,
    cwd: 'src/',
    src: '**',
    dest: 'dest/',
    flatten: true,
    filter: 'isFile',
  },
},
$ grunt copy
Running "copy:main" (copy) task
Copied 2 files
Done, without errors.
$ tree -I node_modules
.
├── Gruntfile.js
├──
|
How to get relative path of files from defined root directory? |
That would be os.path.relpath():
>>> import os
>>> filename = "C:/test/media/blog_images/2013/06/15/yay.gif"
>>> blog_images = "C:/test/media/blog_images/"
>>> os.path.relpath(filename, blog_images)
'2013\\06\\15\\yay.gif'
|
Get Image Path from Images Directory in Supporting Files |
This should do the trick:
[UIImage imageNamed:@"someImageName"];
EDIT:
Some additional information:
-imageNamed: will look through the entire main bundle of the application
for an image file (preferably a png) with the filename "someImageName".
You need not worry about its location or its extension, since it will be
searched for in the main bundle. Files that you import through the
import-file dialogue in Xcode will be added to the main bundle.
This means:
If I have imported a file called myImage.png, calling [UIImage
imageNamed:@"myImage"]; from anywhere in my code will get me a UIImage
object containing that image. It's amazingly simple, and maybe that
startled you a bit ;)
Look it up in the docs if you like:
http://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIIm
|
Directory Getfiles retrieve wrong files path |
From the documentation:
Use this property to construct a URL relative to the application root
from a page or Web user control that is not in the root directory.
So ApplicationPath gives you a relative path, and you're treating it as an
absolute path.
With your current approach you might be interested in
HttpRequest.PhysicalApplicationPath, but I'm not sure if this is sound.
Also, consider using Path.Combine instead of concatenating strings
yourself.
|
Copy-item Files in Folders and subfolders in the same directory structure of source server using Powershell |
This can be done just using Copy-Item. No need to use Get-Childitem. I
think you are just overthinking it.
Copy-Item -Path C:\MyFolder -Destination \\Server\MyFolder -recurse -Force
I just tested it and it worked for me.
|
How to modify this rsync command to find out the directory with '.' in Python? |
Personally I wouldn't bother using grep, I'd simply use Python's own string
filtering - however, that wasn't the question you asked.
Since the filenames are remote and Python sees them as simply strings then
we can't use any of Python's own file manipulation routines (e.g.
os.path.isdir()). So, I think you have three basic approaches:
1. Split each string by slashes and use this to build your own representation
of the filesystem tree in memory. Then, do a pass through the tree and only
display leaf nodes (i.e. files).
2. If you can assume that files within a directory are always listed
immediately after that directory, then you can do a quick check against the
previous entries to see if this entry is a file within one of those
directories.
3. Use meta-information from rsync.
I would suggest th
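A minimal sketch of the first approach in Python - build an in-memory tree
from the path strings and print only the leaves. The function name and sample
input are my own; note that an empty directory would also show up as a leaf
with this approach:
def leaf_files(paths):
    """Given path strings from a recursive listing, return only the entries
    that are not a parent directory of any other entry."""
    tree = {}
    for path in paths:
        node = tree
        for part in path.strip("/").split("/"):
            node = node.setdefault(part, {})
    # walk the tree; nodes with no children are leaves, i.e. files
    results, stack = [], [([], tree)]
    while stack:
        prefix, node = stack.pop()
        for name, children in node.items():
            if children:
                stack.append((prefix + [name], children))
            else:
                results.append("/".join(prefix + [name]))
    return results

print(leaf_files(["backup", "backup/a.txt", "backup/sub", "backup/sub/b.txt"]))
# ['backup/a.txt', 'backup/sub/b.txt'] (order may vary)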
|
Matching each term in a text file against all text files in a directory |
You can do this with awk, but there is a simpler grep solution:
grep -f terms.txt directory/*xml -o
|
How to structure projects with ASP MVC using Repository Pattern, Service Pattern, UnitOfWork, ORM (EF, NHibernate etc..) |
It depends on the complexity of your project, but the general approach is
to put the abstractions in their own library and then reference it from your
web project. The implementation classes you can again place in their own
library, so the web project does not have a direct dependency on them.
|
Linux: Move log files from a mounted dir to a local dir recursively while maintaining directory structure in local direcoty |
How about something like
rsync -azrR --include='*.log' -f 'hide,! */' /media/primary /backup
If you use man rsync and type 2343g it should take you to the line where it
explains this filter. Basically it will hide everything that's not a
directory (every file) from the pattern matching; however, since I've used
the --include='*.log' flag as well it will override that and the pattern
will match only .log files. You can also use the -nv flag to do a dry run
and see what would happen.
|
How exactly does rsync work? Is it smart enough not to transfer anything if your files aren't different? |
Directly from the man page (http://linux.die.net/man/1/rsync):
"Rsync finds files that need to be transferred using a lqquick checkrq
algorithm (by default) that looks for files that have changed in size or in
last-modified time. Any changes in the other preserved attributes (as
requested by options) are made on the destination file directly when the
quick check indicates that the file's data does not need to be updated."
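As a toy illustration of that quick check (not rsync's actual implementation),
the decision boils down to comparing size and last-modified time:
import os

def needs_transfer(src, dst):
    """Toy version of the "quick check": transfer only if the destination is
    missing, or its size / last-modified time differ from the source."""
    if not os.path.exists(dst):
        return True
    s, d = os.stat(src), os.stat(dst)
    return s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime)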
|
Sometimes Rsync fails to copy files from the remote machine |
For validation:
If you want to check whether all files were copied, you can do a simple glob
or ls to get the list of files in the remote dir, then compare it to the list
of files in the target dir.
set globResult [ exec rsh -l $user $host "cd $dir; ls -l | wc -l" ]
Or, to go further, you can always checksum both the remote and the target
directory.
As for the reason for the failure, it could have been network or server
issues. While you are doing the rsync, have you checked the exit status to
make sure rsync completed successfully?
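A local-only sketch of the checksum comparison idea in Python (the two
directory paths are placeholders standing in for the remote copy and the
target):
import hashlib
import os

def checksums(root):
    """Map each file's path (relative to root) to its MD5 checksum."""
    sums = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            with open(full, "rb") as fh:
                sums[os.path.relpath(full, root)] = hashlib.md5(fh.read()).hexdigest()
    return sums

src = checksums("/data/remote_copy")   # placeholder paths
dst = checksums("/data/target")
missing   = set(src) - set(dst)
different = {p for p in set(src) & set(dst) if src[p] != dst[p]}
print(missing, different)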
|
how can I copy the files in path.txt to new folder with same folder structure? |
You don't need to read a list from a file. You could simply try this in the
command prompt:
xcopy "full path to your folder" "full path destination folder" /e
Below is what you get if you type xcopy /? in command prompt:
Copies files and directory trees.
XCOPY source [destination] [/A | /M] [/D[:date]] [/P] [/S [/E]] [/V] [/W]
                           [/C] [/I] [/Q] [/F] [/L] [/G] [/H] [/R] [/T] [/U]
                           [/K] [/N] [/O] [/X] [/Y] [/-Y] [/Z]
                           [/EXCLUDE:file1[+file2][+file3]...]
source Specifies the file(s) to copy.
destination Specifies the location and/or name of new files.
/A Copies only files with the archive attribute set,
doesn't change the attribute.
/M Copies only files with the arc
|
Set thrift output directory without Java style directory structure |
You can customize this in the thrift file by using namespace declarations:
namespace java com.example.plot.gen
namespace rb foo
Now the files for Ruby will be in:
- (out directory)
  - foo
    - my generated files
|
Using batch files to copy all files from a directory tree to a single directory |
@echo off
set "COPY_FROM=C:UsersmeDesktopDisc 1"
set "COPY_TO=C:Testing est"
md "%copy_to%" 2>nul
cd /d "%COPY_FROM%"
for /f "delims=" %%a in ('dir /b /s /a-d') do copy "%%a" "%COPY_TO%"
pause
|
Gallery Is Missing Files From Directory- How To Retrieve All Files From Directory |
An alternative approach you may want to use to secure your list of
thumbnails:
$directory = 'thumb'; //where the gallery thumbnail images are located
$files = glob($directory."/*.{jpg,jpeg,gif,png}", GLOB_BRACE);
natsort($files); //sort by filename
Then to render it out simply do this:
<?php
for($x = 0; $x < count($files); $x++):
$thumb = $files[$x];
$file = basename($thumb);
$nomargin = $x%4 == 0?" nomargin":"";
$title = htmlspecialchars($file);
?>
<div class="thumbs fancybox<?= $nomargin ?>"
     style="background:url('<?= $thumb ?>') no-repeat 50% 50%;">
    <a rel="group" href="images/<?= $file ?>" title="<?= $title ?>"><?= $title ?></a>
</div>
<?php
endfor;
|
Compare files and write matching and not matching things |
I would use a hash to remember what ID corresponds to what name.
#!/usr/bin/perl
use warnings;
use strict;
my $table = 'file_2.txt';
my $column = 'file1.txt';
my %names;
open my $TAB, '<', $table or die $!;
while (<$TAB>) {
    my ($name, @ids) = split;
    push @{ $names{$_} }, $name for @ids;
}
open my $COL, '<', $column or die $!;
while (<$COL>) {
    chomp;
    print @{ $names{$_} // [$_] }, "\n";
}
|
Keeping code structure with string literal that uses whitespace |
if (true)
{
    if (!false)
    {
        // Some indented code;
        stringLiteral = string.Format(
            "This is a really long string literal. " +
            "I don't want it to have whitespace at " +
            "the beginning of each line, so I have " +
            "to break the indentation of my program " +
            "I also have vars here: " +
            "{0} " +
            "{1} " +
            "{2}",
            var1, var2, var3);
        // OR, with lineskips:
        stringLiteral = string.Format(
            "This is a really long string literal\n" +
            "I don't want it to have whitespace at\n" +
            "the beginning of each line, so I have\n" +
|
Django under IIS (with HeliconZoo) in virtual directory - urls resolver works with path without directory name |
Finally, I've found what's wrong with it.
It appeared that the reason for the described behavior was a setting provided
in the default web.config file:
<add name="django.root" value="%APPL_VIRTUAL_PATH%" />
After I removed it, everything started working fine.
Here is some info related to the django.root variable as applied to Apache.
|
Directory.Delete(path, true) throws IOException "The directory is not empty" |
It appears that info.EnumerateFiles was the issue. I got that idea from
this answer.
I switched that to info.GetFiles and I was then able to delete the
directory after.
|
Python joining current directory and parent directory with os.path.join |
You can use normpath, realpath or abspath:
import os
goal_dir = os.path.join(os.getcwd(), "../../my_dir")
print goal_dir # prints C:/here/I/am/../../my_dir
print os.path.normpath(goal_dir) # prints C:/here/my_dir
print os.path.realpath(goal_dir) # prints C:/here/my_dir
print os.path.abspath(goal_dir) # prints C:/here/my_dir
|
How to resolve a relative directory path to canonical path in MATLAB/Octave |
The simplest way I know to convert a path to its canonical form is using the
cd command:
oPath = cd(cd(iPath));
Note that this would fail if the path does not exist on your file system.
|
change structure of url to directory-like structure |
Add these rules to your .htaccess file in the root folder:
To do this: www.linku.biz/JackTrow <--- www.linku.biz/profile.php?us=JackTrow
RewriteRule ^profile\.php\?us=(.*)$ /$1 [R=301,L]
To do this: www.linku.biz/search <--- www.linku.biz/search.php
RewriteRule ^(.*)\.php$ /$1 [R=301,L]
As for your last one, I have not understood what you want exactly.
|