How to insert an inline (heredoc maybe?) python script into a bash stdin/stdout streaming pipeline
It would be better if you explained your goal with this construction; maybe it could be simplified. The problem with this script is that the echo goes to the stdin of the encapsulating shell initiated by the (...) notation. But inside that shell, stdin is redefined as the heredoc piped to python, so python reads the script from stdin, which now comes from the heredoc pipe. So try something like this instead:

    echo -e "Line One\nLine Two\nLine Three" | python <(cat <<HERE
    import sys
    print "stdout hi"
    for line in sys.stdin:
        print line.rstrip()
    print "stdout hi"
    HERE
    )

Output:

    stdout hi
    Line One
    Line Two
    Line Three
    stdout hi

Now the script is read from /dev/fd/<filehandle>, so stdin can be used by the echo's pipe.

SOLUTION #2: There is another solution. The script can b

Categories : Python

How to echo stdin to stdout
You need to flush stdout after writing each character (and only write the character once you know it isn't EOF):

    #include <stdio.h>

    int main(void) {
        int ch;
        while ((ch = getc(stdin)) != EOF) {
            putc(ch, stdout);
            fflush(stdout);
        }
        return 0;
    }

Categories : C

Handling stdin and stdout
This line:

    print p.stdout.read()   # expected this to print output interactively

hangs because read() means "read all data until EOF". See the documentation. It seems like you wanted to read a line at a time:

    print p.stdout.readline()

Categories : Python

Paramiko not outputting stdout
You have closed the connection before reading the lines:

    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    com = "ls ~/desktop"
    client.connect('MyIPAddress', MyPortNumber, username='username', password='password')
    output = ""
    stdin, stdout, stderr = client.exec_command(com)
    print "ssh successful. Closing connection"
    stdout = stdout.readlines()
    client.close()
    print "Connection closed"
    print stdout
    print com
    for line in stdout:
        output = output + line
    if output != "":
        print output
    else:
        print "There was no output for this command"

Categories : Python

Python Threading stdin/stdout
Taking your second question first: this is what mutexes are for. You can get the cleaner output that you want by using a lock to coordinate among the parsers and ensure that only one thread has access to the output stream during a given period of time:

    class parser(threading.Thread):
        output_lock = threading.Lock()

        def __init__(self, data_input):
            threading.Thread.__init__(self)
            self.data_input = data_input

        def run(self):
            for elem in self.data_input:
                time.sleep(3)
                with self.output_lock:
                    print elem + 'Finished'

As regards your first question, note that it's probably the case that multi-threading will provide no benefit for your particular workload. It largely depends on whether the work you do with each input

Categories : Python

replacing stdout and stdin with a file pointer in c++?
It looks like the structure member is expecting a file descriptor number rather than a pointer to a streamed file buffer. Use fileno() to acquire the descriptor from the FILE*:

    #include <stdio.h>

    FILE *fp = fopen("myf", "r");
    msb.sbox.task.ofd = fileno(fp);

Categories : C++

Is stdin, stdout, stderr buffered or unbuffered in Lua?
See setvbuf; it is an interface to the underlying C setvbuf function. For example, you can use it like this:

    io.stdout:setvbuf 'no'   -- switch off buffering for stdout

AFAIK Lua relies on the underlying C runtime to hook into standard streams, so I think the usual guarantees for C standard streams apply.

Categories : Lua

printing stdout in realtime from a subprocess that requires stdin
Something like this, I think:

    from subprocess import Popen, PIPE, STDOUT

    p = Popen('c:/python26/python printingTest.py', stdout=PIPE, stderr=PIPE)
    for line in iter(p.stdout.readline, ''):
        print line
    p.stdout.close()

Using an iterator will return live results, basically. In order to send input to stdin you would need something like:

    other_input = "some extra input stuff"
    with open("to_input.txt", "w") as f:
        f.write(other_input)

    p = Popen('c:/python26/python printingTest.py < some_input_redirection_thing',
              stdin=open("to_input.txt"), stdout=PIPE, stderr=PIPE)

This would be similar to the shell command:

    %prompt%> some_file.o < to_input.txt

See alp's answer for a better way of passing to stdin.

Categories : Python

how to run an application in c++ and write to its stdin and read from its stdout in windows
You can spawn a child process and then gain access to its stdin and stdout pipes. You have to use the WinAPI to achieve this; see the example here: http://msdn.microsoft.com/en-us/library/ms682499(v=vs.85).aspx You may also use Qt and its QProcess class to read/write the child process's output/input.

Categories : C++

Redirect child process's stdin and stdout to pipes
Your code is quite long, so I'm not sure I have understood everything, but why don't you use select? Do you want to redirect the output of the child into a third process, or use it in your parent process? The following example uses cat in the child process:

    #include <unistd.h>
    #include <stdlib.h>

    int main() {
        pid_t pid;
        int p[2];

        pipe(p);
        pid = fork();
        if (pid == 0) {
            dup2(p[1], 1);   // redirect the output (STDOUT to the pipe)
            close(p[0]);
            execlp("cat", "cat", NULL);
            exit(EXIT_FAILURE);
        } else {
            close(p[1]);
            fd_set rfds;
            char buffer[10] = {0};
            while (1) {
                FD_ZERO(&rfds);
                FD_SET(p[0], &rfds);
                select(p[0] + 1, &rfds, NULL, NULL, NULL);
                //

Categories : C

Redirect stdin to stdout while waiting for background process
As mentioned, you can't trap SIGKILL (or SIGSTOP). Other than that, try this:

    #!/bin/bash
    cmd cmd_args <&0 &
    wait_pid=$!
    trap "kill -s SIGTERM $wait_pid" SIGTERM SIGINT
    wait $wait_pid

The <&0 tells bash that you want cmd's stdin to be the script's stdin. Slightly off topic, but another interesting method of dumping bytes into a running process's stdin is to send them directly to its /proc file descriptor, as in:

    echo "Stuff to send to the process" > /proc/$wait_pid/fd/0

Categories : Bash

redirecting stdin/stdout from exec'ed process to pipe in Perl
It's not possible to redirect file descriptors just with assignments. Rather, one needs to use open as described in perldoc -f open. In your case the child code would look like this:

    print "I am the child. My pid = $$\n";
    close(READER);
    open STDOUT, ">&", *WRITER or die $!;
    open STDERR, ">&", *WRITER or die $!;
    print WRITER "XXX output before exec....\n";
    exec($cmd) or exit(1);

Categories : Perl

ruby - IO.popen not working lame stdin and stdout encoding
Did you try the pipe | character? Tested this on Windows with RubyInstaller:

    require 'open3'

    command = 'dir /B | sort /R'   # a windows example command
    Open3.popen3(command) {|stdin, stdout, stderr, wait_thr|
      pid = wait_thr.pid
      puts stdout.read   #=> a list of files in cwd in reverse order
    }

Other ways: Ruby pipes: How do I tie the output of two subprocesses together?

EDIT: using IO::pipe

    require 'open3'

    command1 = 'dir /B'
    command2 = 'sort /R'
    reader, writer = IO.pipe
    Open3.popen3(command1) {|stdin, stdout, stderr, wait_thr|
      writer.write stdout.read
    }
    writer.close
    stdout, stderr, status = Open3.capture3(command2, :stdin_data => reader.read)
    reader.close
    puts "status: #{status}"   # pid and exit code
    puts "stderr: #{stderr}"   # use this to debug command2 errors
    puts

Categories : Ruby

Access STDIN of child process without capturing STDOUT or STDERR
If you're on a platform that supports it, you could do this with pipe, fork and exec:

    # create a pipe
    read_io, write_io = IO.pipe

    child = fork do
      # in child: close the write end of the pipe
      write_io.close
      # change our stdin to be the read end of the pipe
      STDIN.reopen(read_io)
      # exec the desired command, which will keep the stdin just set
      exec 'the_child_process_command'
    end

    # in parent: close the read end of the pipe
    read_io.close
    # write what we want to the pipe; it will be sent to the child's stdin
    write_io.write "this will go to child process's stdin"
    write_io.close
    Process.wait child

Categories : Ruby

How does (x)inetd "split" its connected socket into STDIN/STDOUT/STDERR?
Incoming data to the inetd server is mapped to stdin, and fprintf(stdout) and fprintf(stderr) in the inetd handler are sent back to the client combined into one stream. The issue is that a telnet client can't distinguish the stdout and stderr streams. If you need that, use ssh instead of telnet, which provides the option.

Categories : Sockets

pipe child process stdout & stdin to browser in node.js & browserify
You should look into hyperwatch, which pipes the server side stdout/stderr to the browser and renders it exactly like it would appear in your terminal (including colors). If it doesn't exactly solve your problem, reading through the code should at least help you. It uses hypernal under the hood in order to convert terminal output to html.

Categories : Node Js

How to code a simple php5-fastcgi stdin stdout one time shot for now
To work with stdin/stdout you have to treat them like files, using the php:// wrappers:

    $stdin = fopen("php://stdin", "r");
    $stdout = fopen("php://stdout", "w");

Then you can use the normal file functions (fgets, fread, fputs, fwrite, etc.) to manipulate the values.

Categories : PHP

Bash: Search a script for a block of commands and then execute those commands
You don't have to find specific files when you unbundle; the if..then takes care of that. Make the bundle file be a set of blocks like this:

    if [[ $# = 0 ]] || contains "file1.txt" "$@"
    then
        cat > file1.txt << 'End of file1.txt'
    DATA HERE
    End of file1.txt
    fi

where contains is a function that checks for the first element amongst the rest, e.g.:

    contains() {
        var=$1
        shift
        for f
        do
            [[ $var = "$f" ]] && return 0
        done
        return 1
    }

Each file will then only be unbundled if either there are no arguments, or the filename is amongst them. You can add additional logic in the header to ensure all the filenames specified exist in the file before you start running these blocks.

Categories : Bash

Is it possible to modify stdin for some daily commands?
Can't you just use:

    zsh -f
    ack() {
        command ack "$*"
    }
    ack hey there

or something along those lines? [...] Actually, I was curious whether I could write something along the lines of what you described. I have to reckon that my zsh-fu is not up to it. To muck with the cursor position, you would normally change the CURSOR value (see man zshzle). From within zle widgets you would access special values by using zmodload -i zsh/parameter (see man zshmodules). But the trick is that, as stated in the zshzle man page:

    Inside completion widgets and traps called while ZLE is active, these parameters are available read-only.

So we can't change the value of CURSOR from inside a completion widget.

Categories : Shell

Run script commands from stdin, with arguments
Use sh -s:

    echo "$nested_script" | sh -s "foo" "bar"

You could also have passed the script as a parameter with -c:

    sh -c "$nested_script" -- "foo" "bar"

This frees up the script's stdin.

Categories : Shell

execute user inputed Windows (or bash) commands from batch (or bash) file?
Do you need a full bash prompt? Or would something like this be enough?

    #!/bin/bash
    echo -n "Enter cmd: "
    read COMMAND
    echo ${COMMAND} | bash

Also, in a script you can just execute bash and get a full prompt in the current environment.

Categories : Bash

subprocess or commands.getstatusoutput in STDOUT AND STORE at variable
To print output line by line as soon as the child process flushes its stdout, and to store it in a variable:

    from subprocess import Popen, PIPE

    buf = []
    proc = Popen([cmd], stdout=PIPE, bufsize=1)
    for line in iter(proc.stdout.readline, b''):
        buf.append(line)
        print line,
    proc.communicate()  # close `proc.stdout`; wait for the child process to exit
    output = b"".join(buf)

There could be a buffering issue (the output appears with a delay); to fix it, you could use the pexpect or pty modules, or the stdbuf, unbuffer, or script commands.

Categories : Python

capturing STDERR from commands and pipe STDOUT in perl under windows
That's because '>' doesn't like to share files. Give each stage of the pipeline its own error log, and then execute something like this after the pipeline finishes:

    system("cat error1.log error2.log error3.log > error.log");

Here's a platform-independent way to aggregate the logs:

    my @error_logs = qw( error1.log error2.log error3.log );
    open my $errlog, '>>', 'error.log' or die "problem opening error log: $!";
    foreach my $sublog (@error_logs) {
        open my $fh, '<', $sublog or die "oh boy: $sublog: $!";
        print $errlog "$sublog:\n";
        print $errlog $_ while <$fh>;
        close $fh;
    }
    close $errlog;

(Note: use `or die` rather than `|| die` here, since || binds more tightly than the comma and would attach to the filename, so the die could never fire.)

But there also exist IO::Cat and File::Cat if you decide to lean that way.

1) Corrected the name of the selfish meany that will not share files. 2) Added log file collection.

Categories : Windows

nodejs - Bash-like stdin
The solution is the readline module (doc). It provides an input history, autocompletion (on tab hit), etc. Example:

    var readline = require('readline');

    var rl = readline.createInterface({
        input:  process.stdin,
        output: process.stdout,
    });

    function input (prompt, callback) {
        rl.question(prompt, function (res) {
            if (res === 'quit') rl.close();
            else callback(res);
        });
    }

An example of a command-line interface is given in the doc.

Categories : Node Js

system.stdout and system.stdin is undefined in casperjs
Which version of PhantomJS do you use? Support for standard I/O was introduced in the 1.9 version. Read more here: http://phantomjs.org/release-1.9.html

Categories : Javascript

Pass bash array to stdin
Strictly speaking, you would want something like:

    for line in "$@"; do
        echo "$line"
    done | prog

It's not a here document, but it has the same effect. Here documents and arrays were developed for two different use cases. Even more strictly speaking, $@ is not an array, although it tries very hard to behave like one. :)

Categories : Bash

Two processes reading the same stdin
It's not working yet, but it can be the beginning of a solution:

    void send_command(int *p) {
        pid_t pid;

        pid = fork();   // check for -1
        if (pid == 0) {
            int i = 0;
            int ret;
            char buffer[128] = {0};

            dup2(p[0], 0);
            while (i < 2) {
                if ((ret = scanf("%s ", buffer))) {
                    // get your value from the buffer
                    i++;
                }
            }
            printf("done\n");
            exit(1);
        }
    }

In the child process you read everything from the input and then find the value you need inside.

    int main() {
        char opt;
        int p[2];

        pipe(p);
        while (1) {
            scanf(" %c", &opt);
            write(p[1], &opt, 1);
            write(p[1], " ", 1);
            switch (opt)

Categories : C

Redirect to stdout in bash
If I have understood your requirement correctly, the following should do what you want:

    exec >> $LOG
    exec 2>&1

Stdout and stderr of all subsequent commands will be appended to the file $LOG. Note the order: redirect stdout to the log first, then duplicate stderr onto it. The other way round, stderr would be duplicated onto the old stdout before it was redirected, and would still point at the terminal.

Categories : Bash

How can I take STDIN and use it on the bash shell to set an environment variable?
    export PATH=$(echo "$PATH" | sed -e "s|^/[A-Za-z/]*:||")

export is a shell built-in; that's why you can't execute it directly via xargs (there is no executable for export). This runs your edit script and sets the result as the new value of $PATH. As written, with double quotes around $PATH, it works even if there are multiple adjacent spaces in an element of your PATH (fairly unlikely, but no harm in making sure it works properly).

Categories : Bash

Use alarm to set a timeout for reading stdin
What OS are you running this on? What version of perl? Works fine for me on Mac OS X 10.8.3 with perl 5.12.4. If you're using perl on Windows, you'll find that signals don't work the same as on POSIX and POSIX-like operating systems, and you might need to use the 4-argument version of select() instead.

Categories : Perl

delays in reading stdin with codecs
There are two levels of buffering in this seemingly simple example. To avoid the first level (and more as a workaround than a solution) you can read each line and then decode it, rather than the other way around. This works because end-of-line is still unambiguous in utf-8. (Note: this first piece of code doesn't work, because it still has the second level of buffering! It is included for explanation purposes.)

    for l in sys.stdin:
        l = l.decode('utf-8', 'replace')
        print l

The second level comes from "for l in file". So you actually need:

    while True:
        l = sys.stdin.readline()
        if not l:
            break
        l = l.decode('utf-8', 'replace')
        print l

Categories : Python

Reading from stdin and storing and whitespace
scanf() splits the input at whitespace boundaries, so it's not suitable in your case. Indeed fgets() is the better choice. What you need to do is keep reading after fgets() returns; each call will read a line of input. You can keep reading until fgets() returns NULL, which means that nothing more can be read. You can also use fgetc() instead if you prefer getting input character by character. It will return EOF when nothing more can be read.

Categories : C

Is there an fread analog for reading from stdin?
All of the read.* functions use scan under the hood. scan is fairly low-level, but it does have the capacity to parse lines of data into different classes.

    > mat <- matrix(scan(), 4, 4)   # will paste in block of data
    1: 0.5 0.1428571 0.25
    4: 0.5 0.1428571 0.25
    7: 0.5 0.1428571 0.25
    10: 0.5 0.1428571 0.25
    13: 0.5 0.1428571 0.25
    16: 0.5
    17:   # Terminate with two <cr>'s
    Read 16 items
    > mat
              [,1]      [,2]      [,3]      [,4]
    [1,] 0.5000000 0.1428571 0.2500000 0.5000000
    [2,] 0.1428571 0.2500000 0.5000000 0.1428571
    [3,] 0.2500000 0.5000000 0.1428571 0.2500000
    [4,] 0.5000000 0.1428571 0.2500000 0.5000000
    > lst <- scan(what=list(double(0), "a"))
    1: 4 t
    2: 6 h
    3: 8 l
    4: 8 8
    5:
    Read 4 records
    > lst
    [[1]]
    [1] 4 6 8 8

    [[2]]
    [1] "t" "h" "l" "8"

You sh

Categories : Linux

Outputting multiple files using XPath in bash
Something like this might work:

    #!/bin/bash
    for f in *.xml; do
        fid=$(xpath -e '//fileId/text()' "$f" 2>/dev/null)
        for uid in $(xpath -e '//otherFile/@href' "$f" 2>/dev/null | awk -F= '{gsub(/"/,"",$0); print $3}'); do
            echo "Moving $f to ${fid}_${uid}.xml"
            cp "$f" "${fid}_${uid}.xml"
        done
        rm "$f"
    done

Categories : Xml

stop bash script from outputting in terminal
Append >> /path/to/outputfile/outputfile.txt to the end of every echo statement:

    echo "Process is running." >> /path/to/outputfile/outputfile.txt

Alternatively, send the output to the file when you run the script from the shell:

    [~]# sh check.sh >> /path/to/outputfile/outputfile.txt

Categories : Bash

Loop outputting blank variables - SSH Bash
Try escaping your variable $file, as it is otherwise expanded locally before being sent to the remote server. Change:

    echo $file

to:

    echo \$file

or better yet:

    echo "\$file"

Categories : Bash

Bash script not outputting nohup.out + Jenkins
You should make sure that the output goes into your build's workspace; this avoids permission problems with other directories.

    nohup otherScript.sh > $WORKSPACE/scriptOutput.txt 2>&1 &

Categories : Bash

Bash: Sum of integer values from stdout
Let's do it with awk?

    $ awk '{a+=$1} END{print a}' file
    82

(The pattern-only form awk 'a+=$1; END{print a}' also echoes each input line, because a non-zero pattern with no action prints the current line; the braces keep only the sum.)

With bash:

    tot=0
    while read i
    do
        n=$(echo $i | cut -d' ' -f1)
        tot=$((n + tot))
    done < file

    $ echo $tot
    82

Categories : Bash

Flush/Clear System.in (stdin) before reading
Devices usually send data using a well-defined protocol, which you can use to parse data segments. If that's the case here, discard data that isn't properly formatted for the protocol; this lets you filter out the data you aren't interested in. As I'm not familiar with the RFID scanner you're using I can't be of more help, but this is what I suggest.

Categories : Java

Reading Stdin in chunks... (possibly with scanf?)
The problem is that on the second iteration scanf can't read the format you gave it (the line read from standard input does not match), so it doesn't modify proc. That's also why it returns 0: it has successfully read (and thus assigned) 0 fields.

Categories : C



© Copyright 2017 w3hello.com Publishing Limited. All rights reserved.