Processing output of python file called inside a bash script |
If you mean the output of python, you should test it with $() instead
#!/bin/bash
if test "$(python /var/lib/scripts/Hudson.py result)" = "Success"
then
    echo "Run next command"
else
    exit 1
fi
And it's actually better with [[ ]]
#!/bin/bash
if [[ "$(python /var/lib/scripts/Hudson.py result)" == "Success" ]]
then
    echo "Run next command"
else
    exit 1
fi
If you mean the exit code:
#!/bin/bash
if python /var/lib/scripts/Hudson.py result
then
    echo "Run next command"
else
    exit 1
fi
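A quick way to see the exit-code form in action, using `true` and `false` as stand-ins for the python script (the `check` helper is just for this demo):

```shell
# `true` always exits 0 and `false` always exits 1, so they stand in
# for a passing or failing Hudson.py run.
check() {
    if "$@"
    then
        echo "Run next command"
    else
        echo "Exit the script"
    fi
}
check true    # Run next command
check false   # Exit the script
```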
|
How to paste many lines to a file using a bash script? |
I would use cat together with here-doc syntax for this:
cat <<EOF > /etc/nsswitch.conf
group: compat
shadow: compat
hosts: files mdns4_minimal [NOTFOUND=return] dns mdns4
...
EOF
The statement above will overwrite or create the file with the contents
between the first line and EOF. In the form above even variables like
group: $group would be expanded by bash. If you don't want this, then use
<<'EOF' (note the single quotes ' around the EOF)
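A minimal sketch of the difference, writing to throwaway files under /tmp rather than the real nsswitch.conf:

```shell
group="compat"

# Unquoted EOF: bash expands $group before writing.
cat <<EOF > /tmp/demo_expanded.txt
group: $group
EOF

# Quoted 'EOF': the text is written literally.
cat <<'EOF' > /tmp/demo_literal.txt
group: $group
EOF

cat /tmp/demo_expanded.txt   # group: compat
cat /tmp/demo_literal.txt    # group: $group
```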
|
Read Bash Variables into a Python Script |
You need to export the variables in bash, or they will be local to bash:
export test1
Then, in python
import os
print os.environ["test1"]
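Putting both halves together (assuming a python3 interpreter on the PATH; the variable name test1 comes from the question):

```shell
test1="hello"
export test1
# Without the export, the child python process would not see test1.
python3 -c 'import os; print(os.environ["test1"])'   # prints: hello
```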
|
Source in shell script doesn't work when script called with arguments |
With your script, you are automating the following commands, if xilinx ise
is typed in from the command line:
$ cd /home/sclukey/Xilinx
$ source /opt/Xilinx/14.6/ISE_DS/settings32.sh
$ ise
The response from the script indicates that there's no ise program in the
path. I would check to see where ise is, and if settings32.sh sets up a
path for it.
|
Read txt file and parse the values to bash script |
The following would give you a list of packages for which you want the
reports:
grep "^packages" config.txt | cut -d= -f2 | tr ',' ' '
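For example, with a made-up config.txt (the package names are invented for the demo):

```shell
# Hypothetical config file in the format the question describes.
cat > /tmp/config.txt <<'EOF'
packages=com.foo,com.bar,com.baz
EOF
grep "^packages" /tmp/config.txt | cut -d= -f2 | tr ',' ' '
# prints: com.foo com.bar com.baz
```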
Based on this, you can loop for values in the list:
filename="config.txt"
for i in $(grep "^packages" $filename | cut -d= -f2 | tr ',' ' '); do
    for file in $(find /home/user/ftpuser -maxdepth 1 -name "*.[ew]ar" -type f); do
        echo /apps/oracle/jrockit/4.1.0-1.6.0_37-R28.2.5-x86_64/bin/java -jar ../windup-cli-0.6.8/windup-cli.jar -javaPkgs ${i} -input ../ftpuser/ -output ../reports/ "${file}"
        cp "${file}" /home/user/ftpuser/scanned/
    done
done
|
how to pass variable from python to shell script when shell script is getting called inside python script |
You can:
Step through the list, and then call mail.sh once for each file. This is
the most reliable way.
Convert the items in the list to a space delimited string, and then pass it
in to mail.sh (assuming mail.sh is setup correctly, it should accept
multiple arguments).
What you cannot do is pass ['a.txt','b.txt','c.txt'] wholesale to the
script.
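A sketch of the second option, with a stand-in mail.sh that just reports its arguments (the /tmp path and script body are made up for the demo):

```shell
# Stand-in for mail.sh: prints how many arguments it received.
cat > /tmp/mail.sh <<'EOF'
#!/bin/sh
echo "got $# files: $@"
EOF
chmod +x /tmp/mail.sh

# From Python, pass each list item as a separate argument --
# not the repr of the whole list.
python3 -c "
import subprocess
files = ['a.txt', 'b.txt', 'c.txt']
subprocess.call(['/tmp/mail.sh'] + files)
"
# prints: got 3 files: a.txt b.txt c.txt
```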
|
dump files from bash script in different directory from where python script ran it |
You should change directory within the same command:
cmd = "/path/to/executable/executable"
outputdir = "/path/to/output/"
subprocess.call("cd {} && {}".format(outputdir, cmd), shell=True)
|
Is it possible to have bash script output multiple lines over the same lines without 'clear'? |
That's how you can do it:
while true; do
date
sensors | grep "temp1"
sensors | grep "Core"
acpi
sleep 1
for i in {1..4}; do # clear four lines above
tput cuu1 # up by one line
tput el # clear that line
done
done
Use man tput for more info. To see the list of capabilities use man
terminfo
Edit:
Here is a hack that I came up with to avoid blinking:
while true; do
echo -n "$(date)"; tput el; echo
echo -n "$(sensors | grep "temp1")"; tput el; echo
echo -n "$(sensors | grep "Core")"; tput el; echo
echo -n "$(acpi)"; tput el; echo
sleep 1
tput cuu 4
    # That's how you pass several actions to tput at once
    # (but instead of repeating cuu1, use 'cuu N'):
    # tput -S <<< $'cuu1\ncuu1\ncuu1\ncuu1'
done
And of course
|
What do I do to make a python script that can run from any directory: the script file doesn’t have to be in the same directory as the .csv files? |
Assuming you mean to include a fixed CSV file with your code, store an
absolute path based on the script path:
HERE = os.path.dirname(os.path.abspath(__file__))
csv_file = open(os.path.join(HERE, 'somefile.csv'))
__file__ is the filename of the current module or script,
os.path.dirname(__file__) is the directory the module resides in. For
scripts, __file__ can be a relative pathname, so we use os.path.abspath()
to turn that into an absolute path.
This means you can run your script from anywhere.
If you meant to make your script work with arbitrary CSV input files, use
command line options:
import argparse

if __name__ == '__main__':
    parser = argparse.ArgumentParser('CSV importer')
    parser.add_argument('csvfile', type=argparse.FileType('r'),
                        default
|
Bash script to wait for gnome-terminal to finish before continuing script, only works for first instance of script |
#!/bin/bash
date
bash -c "sleep 7" &
bash -c "sleep 5" &
wait
date
As you can see while running this script, both sleep commands will run in
parallel, but main thread stalls, while they are running.
Sat Jul 27 01:11:49 2013
Sat Jul 27 01:11:56 2013
Replace sleep 7 with expect launchneuron.exp
and sleep 5 with expect launchmpj.exp
and add your plot commands after calling "wait":
echo "Simulation Complete"
...(your code to plot results)
|
How to write the results of a batch file called in a Python script to a file |
Why are you calling list2cmdline? This doesn't actually call the
subprocess.
Use subprocess.check_output instead:
import os
import subprocess

output = []
for root, _, files in os.walk(directory):
    for f in files:
        fullpath = os.path.join(root, f)
        output.append(subprocess.check_output([fullpath]))
print '\n'.join(output)
|
Bash script to tail -f with colored lines |
Do not quote the argument variable:
tail -f input | perl -pe 's/.*'$1'.*/\e[1;31m$&\e[0m/g'
You can also use grep for this:
tail -f input | grep -e $1 -e '' --color=always
and to color the whole line with grep:
tail -f input | grep -e ".*$1.*" -e '' --color=always
|
How to get a file on a memory stick read into a python script? |
Since you don't explicitly open the file yourself, the simplest thing to do
in this case would be to just make sure that the path to the file you pass
asciitable.read() is valid. Here's what I mean:
import asciitable
import os
from string import ascii_uppercase
import sys
PATH_TEMPLATE = '{}:/ECBGF/bg0809_protected.txt'
for drive in ascii_uppercase[:-24:-1]:  # letters 'Z' down to 'D'
    file_path = PATH_TEMPLATE.format(drive)
    if os.path.exists(file_path):
        break
else:
    print 'error, file not found'
    sys.exit(1)

x = asciitable.read(file_path, guess=False, delimiter=' ',
                    fill_values=[('', '-999')])
|
Cloning git in bash script called from php webpage |
So to pack up my comments in an answer:
The shell script is run as the apache user, and since git uses ssh, the
corresponding config files are needed. These would be created in /var/www,
apache's home directory, but apache did not have write permission in
/var/www and thus could not create them.
To resolve this, create the /var/www/.ssh directory yourself and give
www-data (or whatever user apache runs under on your system) write access
to that folder.
Next, github requires you to authorize ssh keys. It is safer to create a
new one for apache in the newly created /var/www/.ssh directory and add
this key to your github keychain.
|
nmap slow when called from bash script |
In the script you're calling nmap -sn 168.1.1.0-255 rather than nmap -sn
192.168.1.0-255. 192.168.* is a private subnet which is understandably
quicker to scan than 168.1.1.*, a public IP address range out on the
Internet.
|
bash script read line by line and echo to file |
make some edits like this:
iname=checktest
while IFS= read -r line
do
    if [ -z "$line" ]
    then
        echo "" >> "${iname}2.txt"
    else
        echo "$line" >> "${iname}2.txt"
    fi
done < "$iname"
it should work now, hope this helps.
|
Redirect output of a .bat file run by a python script to GUI, and making that script into EXE |
You could use the subprocess module to call your script
Example calling the 'ls' command on Linux:
>>> from subprocess import call
>>> call(['ls', '-l'])
total 0
-rw-rw-r-- 1 user group 0 Jun 17 18:37 file1
-rw-rw-r-- 1 user group 0 Jun 17 18:37 file2
-rw-rw-r-- 1 user group 0 Jun 17 18:37 file3
0
|
Invoke python script from another python script and set execution directory of the executed script |
You could use cwd parameter, to run scriptB in its directory:
import os
from subprocess import check_call
check_call([scriptB], cwd=os.path.dirname(scriptB))
|
bash script that has sudo PW already in it so it doesn't prompt user |
Short version: No.
Longer version: sudo uses the users own password, so you cannot put it in
the script.
A better approach is to configure sudo so that people can run the script.
For example, if the script is /usr/local/bin/root_stuff.sh then put
something like
Cmnd_Alias ROOTCMD = /usr/local/bin/root_stuff.sh
%users ALL = NOPASSWD: ROOTCMD
Your users can then run sudo root_stuff.sh
Or if that is really beyond them, put the code in
/usr/local/bin/root_stuff_inner.sh (changing ROOTCMD above) and put this in
root_stuff.sh
#!/bin/sh
sudo /usr/local/bin/root_stuff_inner.sh
|
PHP script to remove all lines of the file except last n lines |
Simply get the number of lines, and use that to know the line numbers for
the last two lines:
$lines_array = file("./home/userdata/log.ini");
$lines = count($lines_array);
$new_output = "";
for ($i = $lines - 2; $i < $lines; $i++) {
    $new_output .= $lines_array[$i];
}
file_put_contents("./home/userdata/log.ini", $new_output);
|
Piping Perl script output to head -n 10 kills script after printing 10 lines |
When the reading end of a pipe is closed, and the writing process tries to
write something to a pipe, then the writing process receives a SIGPIPE. The
pipe is called broken.
We can capture this event like
local $SIG{PIPE} = sub {
    # This is our event handler.
    warn "Broken pipe, will exit\n";
    exit 1;
};
This would gracefully exit your program. Instead of installing a sub as
event handler, you could give the string IGNORE. This would let your script
carry on as if nothing happened.
# print will now return false with $!{EPIPE} true instead of dying
local $SIG{PIPE} = 'IGNORE';
|
how to create a script from a perl script which will use bash features to copy a directory structure |
First, I see that you want to make a copy-script - because if you only need
to copy files, you can use:
system("cp -r /sourcepath /targetpath");
Second, if you need to copy subfolders, the -r switch already handles that recursively.
|
bash shell script error works on command line, not in script |
Variables are supposed to contain data, and bash treats them as data. This
means that shell meta-characters like quotes are also treated as data.
See this article for a complete discussion on the topic.
The short answer is to use arrays instead:
ASCIIDOC_OPTS=( --asciidoc-opts='-a lang=en -v -b docbook -d book' )
DBLATEX_OPTS=( --dblatex-opts='-V -T db2latex' )
cmd=(a2x -v -f pdf -L "${ASCIIDOC_OPTS[@]}" "${DBLATEX_OPTS[@]}" "$1".asciidoc)
# Print command in pastable format:
printf '%q ' "${cmd[@]}"
printf '\n'
# Execute it
"${cmd[@]}"
Make sure not to use eval:
eval "$cmd" #noooo
This will appear to work with your code as you posted it, but has caveats
and security problems.
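To see why quotes-in-a-string misbehave while arrays don't (the variable names here are illustrative):

```shell
# In a plain string, the single quotes become part of the data.
opts_string="-a 'lang=en' -v"
# In an array, the quotes are syntax and each element stays intact.
opts_array=(-a 'lang=en' -v)

printf '[%s]' $opts_string; echo        # [-a]['lang=en'][-v]
printf '[%s]' "${opts_array[@]}"; echo  # [-a][lang=en][-v]
```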
|
How to pass output from remote script to local script in bash |
On your local script, in your ssh line, you can redirect some of the
outputs to a file with tee:
ssh ... | tee -a output.log
If you want to filter which one goes to the output.log file, you can use
process substitution:
ssh .... | tee >(grep "Some things you want to filter." >> output.log)
Besides grep you can use other commands as well like awk.
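A local stand-in for the ssh pipeline (printf fakes the remote output; /tmp/output.log is just for the demo):

```shell
rm -f /tmp/output.log
# All three lines still reach stdout; only the ERROR line lands in the log.
printf 'ok 1\nERROR: disk full\nok 2\n' |
    tee >(grep "ERROR" > /tmp/output.log)
sleep 0.2   # give the process substitution a moment to finish writing
cat /tmp/output.log   # ERROR: disk full
```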
|
Create bash script with menu of choices that come from the output of another script |
This might work for you:
#!/bin/bash
# Set the prompt for the select command
PS3="Type a number or 'q' to quit: "
# Create a list of customer names and numbers (fill gaps with underscores)
keys=$(/usr/local/bin/info $1 | sed 's/ /_/g')
# Show a menu and ask for input.
select key in $keys; do
    if [ -n "$key" ]; then
        /usr/local/bin/extrainfo $(sed 's/.*_11111/11111/' <<<"$key")
    fi
    break
done
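A self-contained sketch of how the menu behaves (the customer entries are invented, and the choice is piped in instead of typed):

```shell
PS3="Type a number or 'q' to quit: "
keys="Alice_Smith Bob_Jones"
# The menu and prompt go to stderr; the answer "2" is read from stdin.
select key in $keys; do
    if [ -n "$key" ]; then
        echo "picked: $key"
    fi
    break
done <<< "2"
# prints: picked: Bob_Jones
```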
|
adding lines in C code using Python script |
The basic idea for any code-transforming tool is simple:
Iterate through the source line by line (or token by token, or whatever's
appropriate—but given your sample, lines are fine). Copy the lines to a
new file, also adding in whatever new lines are added, and keeping track of
any information you'll need later.
Here's the skeleton to use:
import re

rloop = re.compile(r'…')
rendloop = re.compile(r'…')
with open('old.c') as oldc, open('new.c', 'w') as newc:
    loops = {}
    for line in oldc:
        mloop = rloop.match(line.strip())
        if mloop:
            loops[mloop.group(1)] = mloop.groups()
            newc.write(…)  # appropriate start-of-loop code
        newc.write(line)
        mendloop = rendloop.match(line.strip())
        if mendloop:
            matching_start = loops[mendloop.group(1)]
|
BASH: setting an environment variable from within my script doesn't set it anywhere else - how can I export globally? Simple example within |
Instead you should try the following :
PROMPT>. ./init
Notice the extra dot . and the space before ./init. That is important. It
is equivalent to source ./init.
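A minimal illustration with a throwaway init file under /tmp:

```shell
cat > /tmp/init <<'EOF'
MYVAR=hello
EOF

bash /tmp/init                 # runs in a child shell...
echo "after run: '$MYVAR'"     # ...so MYVAR is still empty here

. /tmp/init                    # sourcing runs it in the current shell
echo "after source: '$MYVAR'"  # now it prints 'hello'
```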
|
Bash: increment a variable from a script every time when I run that script |
A script is run in a subshell, which means its variables are forgotten once
the script ends and are not propagated to the parent shell which called it.
To run a command list in the current shell, you could either source the
script, or write a function. In such a script, plain
(( n++ ))
would work - but only when called from the same shell. If the script should
work from different shells, or even after switching the machine off and on
again, saving the value in a file is the simplest and best option. It might
be easier, though, to store the variable value in a different file, not the
script itself:
[[ -f saved_value ]] || echo 0 > saved_value
n=$(< saved_value)
echo $(( n + 1 )) > saved_value
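The counter-file approach above, exercised twice with an explicit path under /tmp (the `bump` helper just wraps the three lines):

```shell
counter=/tmp/saved_value
rm -f "$counter"
bump() {
    [[ -f "$counter" ]] || echo 0 > "$counter"
    n=$(< "$counter")
    echo $(( n + 1 )) > "$counter"
}
bump
bump
cat "$counter"   # 2
```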
Changing the script when it runs might have strange consequences,
especially when yo
|
Run PBS script and post-process output within bash script |
I don't believe PBSPro supports this, but TORQUE (another PBS derivative)
has a -x option that you might be interested in. You can submit a job like
this:
qsub -I -x <executable>
This would run your job interactively and run the executable, with all of
the output directed to your terminal, and the job will execute as soon as
that executable terminates. You could then begin post-processing at that
point. PBSPro may have similar functionality, but what I've described here
is for TORQUE.
|
how can i pass a variable from my php script and send it to my bash script |
Try using shell_exec in your php script to execute your shell script and
pass your variable, like so:
$cmd="perl -pi -e 's/ : /:/g' /opt/lampp/htdocs/" .
escapeshellarg($variable);
$r=shell_exec($cmd);
escapeshellarg is used to escape any potentially dangerous characters in
$variable, to prevent a command line injection attack.
|
BASH: How To Create a Setup Script to build another script |
This answer is a work in progress. Your code
while [ "$index" -lt "$elements" ]
do
?????????
echo "Your Directory is ~/$root/${gitdir[0]}/${gitdir[1]}/${colours[2]}"
done
becomes
fullPath="/${root}"
index=1
while [ "$index" -lt "$elements" ] ; do
    # append values from $gitdir until you are done
    fullPath="${fullPath}/${gitdir[$index]}"
    (( index++ ))
done
# not sure how colours got introduced to this but same idea
fullPath="${fullPath}/${colours[2]}"
echo "Your Directory is ~${fullPath}"
use of (( index++ )) implies using a version of bash, ksh, zsh (maybe
others) that support arithmetic evaluations.
That said, it's not clear what your input into gitdir[@] will be and why
you need to "count" the levels. Why not just accept user input as
arguments, document the o
|
python script to remove reversed repeated lines |
You can use collections.OrderedDict here:
>>> from collections import OrderedDict
>>> dic = OrderedDict()
>>> with open('file.txt') as f:
...     for line in f:
...         key = tuple(tuple(x.split(',')) for x in line.split())
...         rev_key = tuple(x[::-1] for x in key)
...         if key not in dic and rev_key not in dic:
...             dic[key] = line.strip()
...
>>> for v in dic.itervalues():
...     print v
...
1,2 3,4
5,6 7,8
5,6 8,7
|
python : access namespace of called script |
You can't access this. By the time you have arrived in the last line of
your script, the called script has finished executing. Therefore its
variables don't exist any more. You need to send this data to the calling
script in some other way (such as the called script printing it on the
standard output and the calling script getting it from there).
Even if it hadn't finished executing, I don't think you could access its
variables. In other words, your impression is wrong :-)
|
python doesn't run script with 'python |