SocketServer Python
Here is a very simple example of how it could work. You would start this and then call it like this, for example:

    curl -i 'http://localhost:5001/foo/bar?foo=bar' -X POST -d '{"Foo":"Bar"}'
    HTTP/1.1 200 OK
    Some response

It is missing tons of things, but this should at least give you some sort of idea.

    import SocketServer

    class MyTCPHandler(SocketServer.BaseRequestHandler):
        def handle(self):
            self.data = self.request.recv(1024).strip()
            print self.data
            self.parse_request(self.data)
            func, args = self.path.split("/", 1)
            args = args.split("/")
            resp = getattr(self, func)(*args)
            self.request.sendall("HTTP/1.1 200 OK\r\n")
            self.request.sendall("\r\n")
            self.request.sendall(resp)

        def parse_request(self, req):
            headers
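For comparison, here is a minimal runnable Python 3 sketch of the same dispatch idea (the module was renamed socketserver in Python 3; the handler method foo and the "Some response" text are made up for illustration, and the HTTP parsing is deliberately naive):

```python
import socket
import socketserver
import threading

class MiniHandler(socketserver.BaseRequestHandler):
    """Toy dispatcher: maps the first path segment to a method name."""
    def handle(self):
        data = self.request.recv(1024).decode("utf-8")
        # The first line of an HTTP request looks like: POST /foo/bar HTTP/1.1
        path = data.split("\r\n")[0].split(" ")[1].split("?")[0]
        func, _, rest = path.lstrip("/").partition("/")
        resp = getattr(self, func)(*rest.split("/"))
        self.request.sendall(b"HTTP/1.1 200 OK\r\n\r\n" + resp.encode("utf-8"))

    def foo(self, *args):  # hypothetical handler method
        return "Some response"

server = socketserver.TCPServer(("127.0.0.1", 0), MiniHandler)  # port 0 = any free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as conn:
    conn.sendall(b"POST /foo/bar HTTP/1.1\r\n\r\n")
    reply = conn.recv(1024).decode("utf-8")
server.shutdown()
server.server_close()
print(reply)
```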

Categories : Python

python socket and socketserver
You've got multiple problems here. The first is that you're printing bytes objects directly:

    print(name, "wrote:".format(self.client_address[0]))

That's why you get b'Bob' wrote: instead of Bob wrote:. When you print a bytes object in Python 3, this is what happens. If you want to decode it to a string, you have to do that explicitly. You have code that does that all over the place. It's usually cleaner to use the decode and encode methods than the str and bytes constructors, and if you're already using format there are even nicer ways to deal with this, but sticking with your existing style:

    print(str(name, "utf-8"), "wrote:".format(self.client_address[0]))

Next, I'm not sure why you're calling format on a string with no format parameters, or why you're mixing multi-argument pr
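To see the difference in isolation (a standalone sketch, independent of the socket code in the question):

```python
name = b"Bob"                 # bytes, as returned by socket.recv()
print(name)                   # prints the repr: b'Bob'
print(name.decode("utf-8"))   # prints the text: Bob

# str() on bytes WITHOUT an encoding gives the repr, not a decode:
assert str(name) == "b'Bob'"
# str() WITH an encoding, or .decode(), gives the actual text:
assert str(name, "utf-8") == "Bob"
assert name.decode("utf-8") == "Bob"
```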

Categories : Python

Python SocketServer deadlock issue
    # Do shutdown() if threaded, else execute server_close()
    if self.sysconf.threaded:
        self.socket.server.shutdown()
    else:
        self.socket.server.server_close()
    return (True, "")

self.sysconf is the configuration for the daemon (conf in the code in the question), and self.socket is a reference to the stream handler.

Categories : Python

SocketServer used to control PiBot remotely (python)
The problem is that when Tkinter catches a key event, it triggers the more specific binding first (for example 'Key-Up'), and the event is never passed to the more general binding ('Key'). Therefore, when you press the 'up' key, KeyUp is called, but transmit is never called. One way to solve this would be to just call transmit() within all the callback functions (KeyUp, KeyDown, etc). For example, KeyUp would become:

    def KeyUp(event):
        Drive = 'forward'
        drivelabel.set(Drive)
        labeldown.grid_remove()
        labelup.grid(row=2, column=2)
        transmit()

Then you can get rid of the event binding to 'Key'. Another option would be to make "Drive" and "Steering" into Tkinter.StringVar objects, then bind to write events using "trace", like this:

    Drive = tk.StringVar()
    Drive.set('idle

Categories : Python

python http server, multiple simultaneous requests
I had the same problem, but with no Tornado and no MySQL. Do you have one database connection shared with the whole server? I created a multiprocessing.Pool. Each worker has its own db connection, provided by an init function. I wrap the slow code in a function and map it onto the Pool, so I have no shared variables or connections. sleep does not block other threads, but a DB transaction may block threads. You need to set up the Pool at the top of your code.

    def spawn_pool(fishes=None):
        global pool
        from multiprocessing import Pool

        def init():
            from storage import db  # private connections
            db.connect()  # connections stored in db-framework and will be global in each process

        pool = Pool(processes=fishes, initializer=init)

    if __name__ == "__main__":
        spawn_pool(8)
        from storage import db  # shared connection for

Categories : Python

Uploading multiple files in a single request using python requests module
Multiple files with different key values can be uploaded by adding multiple dictionary entries:

    files = {'file1': open('report.xls', 'rb'),
             'file2': open('otherthing.txt', 'rb')}
    r = requests.post('http://httpbin.org/post', files=files)

Categories : Misc

How to limit download rate of HTTP requests in requests python library?
There are several approaches to rate limiting; one of them is the token bucket, for which you can find a recipe here and another one here. Usually you would want to do throttling or rate limiting on socket.send() and socket.recv(). You could play with socket-throttle and see if it does what you need. This is not to be confused with x-ratelimit rate-limiting response headers, which relate to a number of requests rather than a download/transfer rate.
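A standalone toy sketch of the token-bucket idea (this is not the API of the socket-throttle package mentioned above, just the algorithm itself):

```python
import time

class TokenBucket:
    """Allow up to `rate` units per second, with bursts up to `capacity`."""
    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def consume(self, amount):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= amount:
            self.tokens -= amount
            return True   # allowed
        return False      # caller should wait and retry

bucket = TokenBucket(rate=100, capacity=10)
allowed = [bucket.consume(1) for _ in range(12)]
print(allowed.count(True))  # roughly 10: the burst capacity
```

In a throttled socket wrapper, send() and recv() would call consume() with the number of bytes and sleep until it returns True.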

Categories : Python

python-requests returning unicode Exception message (or how to set requests locale)
You can try os.strerror, but it would probably return nothing or the same non-English string. This hard-coded English was scraped from here: http://support.microsoft.com/kb/819124

    ENGLISH_WINDOWS_SOCKET_MESSAGES = {
        10004: "Interrupted function call.",
        10013: "Permission denied.",
        10014: "Bad address.",
        10022: "Invalid argument.",
        10024: "Too many open files.",
        10035: "Resource temporarily unavailable.",
        10036: "Operation now in progress.",
        10037: "Operation already in progress.",
        10038: "Socket operation on nonsocket.",
        10039: "Destination address required.",
        10040: "Message too long.",
        10041: "Protocol wrong type for socket.",
        10042: "Bad protocol option.",
        10043: "Protocol not supported.",
        10044: "Socket type not supported.",

Categories : Python

Consecutive requests with python Requests.Session() not working
In the latest version of requests, the Session object has cookie persistence; see the requests Session objects docs. So you don't need to add the cookie artificially. Just:

    import requests

    s = requests.Session()
    login_data = dict(userName='user', password='pwd')
    ra = s.post('http://example/checklogin.php', data=login_data)
    print ra.content
    print ra.headers
    ans = dict(answer='5')
    r = s.post('http://example/level1.php', data=ans)
    print r.content

Just print the cookies to check whether you were logged in:

    for cookie in s.cookies:
        print (cookie.name, cookie.value)

And is the example site yours? If not, maybe the site rejects bots/crawlers. You can change your request's User-Agent so it looks like you are using a browser. For example:

    import requests

    s = requests.Session()
    headers

Categories : Python

Shut down SocketServer on SIG*
Disclaimer: I have 0, nil, null, none, no experience with Python. Disclaimer 2: I in no way think that your server is "the way to go" when it comes to anything server related, not even for the most basic things or anything outside school homework; it might be a decent sample to help people learn the basics, but at the same time it is misleading and wrong on so many levels I lost count. Back to your problem. I took your code and modified it to work as intended:

    #!/usr/bin/python
    import signal
    import SocketServer
    import threading
    import thread

    class DummyServer(SocketServer.BaseRequestHandler):
        def handle(self):
            data = self.request.recv(1024)
            self.request.send(data)
            return

    def shutdownHandler(msg, evt):
        print "shutdown handler called. shutting d
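A self-contained Python 3 sketch of the same idea, shutting a server down from a SIGTERM handler (the module was renamed socketserver in Python 3; the echo handler and names here are illustrative, not the question's code):

```python
import os
import signal
import socket
import socketserver
import threading

class EchoHandler(socketserver.BaseRequestHandler):
    def handle(self):
        self.request.sendall(self.request.recv(1024))

server = socketserver.TCPServer(("127.0.0.1", 0), EchoHandler)  # port 0 = any free port
port = server.server_address[1]
t = threading.Thread(target=server.serve_forever)
t.start()

def on_signal(signum, frame):
    # serve_forever() polls an internal flag, so shutdown() here is safe.
    server.shutdown()

signal.signal(signal.SIGTERM, on_signal)

# Exercise the server once, then signal ourselves to stop it.
with socket.create_connection(("127.0.0.1", port)) as conn:
    conn.sendall(b"ping")
    echoed = conn.recv(1024)

os.kill(os.getpid(), signal.SIGTERM)
t.join(timeout=5)
server.server_close()
print(echoed, t.is_alive())
```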

Categories : Python

SocketServer no modules are getting import
It seems like you have a SocketServer.py somewhere on the Python path. Check using the following command:

    python -c "import SocketServer; print(SocketServer.__file__)"

Renaming that file will solve your problem.

UPDATE: Rename the file /Users/ddl449/Projects/visualization/SocketServer.py. If there is a /Users/ddl449/Projects/visualization/SocketServer.pyc, remove that file too.
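A quick way to check which file a module name actually resolves to, without importing it (shown here with the Python 3 name socketserver for illustration; the question's Python 2 module is SocketServer):

```python
import importlib.util

# Where would "socketserver" be imported from?
spec = importlib.util.find_spec("socketserver")
print(spec.origin)  # path of the file that provides (or shadows) the module

# If a local file in your project shadowed it, this path would point into
# your project directory instead of the standard library.
assert spec.origin.endswith("socketserver.py")
```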

Categories : Python

Best way to upload multiple files as part of a REST API? Single or multiple POST requests?
To do this, you'd need your client to upload in mime/multipart format. I don't know PHP, but I'm sure there's a library out there that will support receiving/parsing the multipart messages you get. As for whether it's a good idea: if initiating the request is the creation of a single resource, it's not unreasonable to accept mime/multipart. If the parts being sent are themselves full-fledged resources, it would probably be better to make the client send them up separately, and reference them in the initiation request. Also note that mime/multipart is going to be a bit harder for your clients to deal with than simple requests. This post seems to be related to what you're trying to accomplish.

Categories : Api

How to get input from SocketServer MyTCPHandler to another class
Since both classes are in the same process (and hence the same memory space), why not just use shared data? If you are worried about the data being overwritten by threads, you can add a lock on that data. You could also consider making the second class an instance of the first one; in that case, sharing data would be seamless. Here is a discussion you might find useful: How to share data between two classes. Try this (you can remove the lock if you have only one client and it is okay for the data to be overwritten):

    lock = threading.Lock()
    data_title = ""

    class Network(Tk):
        def __init__(self, server):
            Tk.__init__(self)
            self._server = server
            t = threading.Thread(target=self._server.serve_forever)
            t.setDaemon(True)  # don't hang on exit
            t.start()
            se
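A standalone sketch of the shared-data-plus-lock idea (the class and variable names here are made up for illustration; Producer stands in for the TCP handler and Consumer for the GUI class):

```python
import threading

lock = threading.Lock()
data_title = ""

class Producer:
    """Stands in for the TCP handler writing shared state."""
    def publish(self, title):
        global data_title
        with lock:  # guard against concurrent writers
            data_title = title

class Consumer:
    """Stands in for the GUI class reading shared state."""
    def read(self):
        with lock:
            return data_title

Producer().publish("hello from handler")
print(Consumer().read())  # -> hello from handler
```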

Categories : Python

Qt. How to send multiple simultaneous requests from multiple proxy servers
QNetworkAccessManager supports multiple concurrent requests; there's no need for you to use multithreading directly unless that makes things easier on your end. To track multiple subsequent requests belonging to each other, I'd suggest using the Command pattern to group them: each flow of communication to one of the proxies would be managed by a job object. See this other answer where I describe why I find the Command pattern useful, in a very similar context.

Categories : Qt

Requests library crashing on Python 2 and Python 3 with
This means that the server did not send an encoding for the content in the headers, and the chardet library was also not able to determine an encoding for the contents. You in fact deliberately test for the lack of encoding; why try to get decoded text if no encoding is available? You can try to leave the decoding up to the BeautifulSoup parser:

    if response.encoding is None:
        soup = bs4.BeautifulSoup(response.content)

and there is no need to pass in the encoding to BeautifulSoup, since if .text does not fail, you are using Unicode and BeautifulSoup will ignore the encoding parameter anyway:

    else:
        soup = bs4.BeautifulSoup(response.text)

Categories : Python

How to structure Python code to support multiple releases of MY project (i.e. not multiple versions of Python)
Your subprocess offers an API to the web sites. The trick is to make it so that API v2 of the subprocess code can handle calls from both v1 and v2 web sites. This is called backward compatibility. Also, it's nice if the v1 web site is not too picky about the data it receives and can, for instance, handle a v2 answer from the subprocess that has more information than it used to have in v1. This is called forward compatibility. JSON and XML are good ways to achieve it, since you can add properties and attributes at will without harming the parsing of the old properties. So the solution, I think, does not lie in a Python trick, but in careful design of the API of your subprocess, such that the API will not break as the subprocess's functionality increases.

Categories : Python

C# Set Credential Only Once For Multiple Requests
Passing NetworkCredential to HttpWebRequest in C# from an ASP.NET page: this link shows NetworkCache, which may be a solution for you.

Categories : C#

Redis multiple requests
You can easily refactor your code to collapse the 15 requests into one by using pipelining (which redis-rb supports). You get the ids from the sorted sets with the first request, and then use them to get the many keys you need based on those results (using the pipeline). With this approach you should have 2 requests in total instead of 16, and keep your code quite simple. As an alternative, you can use a Lua script and fetch everything in one request.

Categories : Ruby

php ajax multiple requests
You can use the same file for multiple requests. You can supply parameters along with the AJAX request, either by including them in the URL after ? (they'll be available in $_GET and $_REQUEST) or by using the POST method and sending them as form data (they'll be available in $_POST and $_REQUEST). You can use the JavaScript FormData API to encode this properly; see the documentation here. Using the jQuery library can simplify all of this. One of the parameters can then be a command or operation code, and the script can take different actions based on this.

Categories : PHP

Multiple requests using EasyPHP
Well, I'm not sure what you are asking, but EasyPHP is just an Apache server, so you can make as many virtual hosts as you wish. That way you can make one site at script1/ and a second at script2/. If you point your browser at script1/ you will be running the first website, and in another tab you can open script2/ and run the second website at the same time. You can look here: Working on multiple sites with an easyPHP offline server, or just google "apache virtual host".

Categories : PHP

Multiple JSON requests in iOS
There are much better ways of doing this. The problem with what you are trying to do is that it is synchronous, which means your app will have to wait for this action to be completed before it can do anything else. I definitely would recommend looking into making this an asynchronous call by simply using NSURLConnection and NSURLRequests, and setting up delegates for them. They are relatively simple to set up and manage and will make your app run a million times smoother. I will post some sample code to do this a little later once I get home.

UPDATE: First, the class that is calling these connections will need to be a delegate for the connections in the interface file, so something like this:

    // ViewController.h
    @interface ViewController : UIViewController <NSURLConnectionDele

Categories : IOS

Multiple each and ajax requests
The problem is that $.when() acts on deferred objects; however, sub doesn't return anything, so $.when() fires right away. What you need to do is collect all the deferred objects returned by the ajax calls and return them:

    var order = [];

    function sub(selector) {
        var deferredList = [];
        selector.each(function () {
            var out = {
                "some": "random",
                "stuff": "here"
            };
            var deferred = $.ajax({
                type: "POST",
                url: "/test/url",
                dataType: 'json',
                contentType: "application/json; charset=utf-8",
                data: JSON.stringify(out),
                success: function (response) {
                    $(this).attr("data-response", response);
                    order.push(response);

Categories : Javascript

python-requests - can't login
There are lots of options, but I have had success using cookielib instead of trying to "manually" handle the cookies.

    import urllib2
    import cookielib

    cookiejar = cookielib.CookieJar()
    cookiejar.clear()
    urlOpener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookiejar))
    # ...etc...

Some potentially relevant answers on getting this set up are on SO, including: http://stackoverflow.com/a/5826033/1681480

Categories : Python

Python : Soap using requests
It is indeed possible. Here is an example calling the Weather SOAP service using the plain requests lib:

    import requests

    url = "http://wsf.cdyne.com/WeatherWS/Weather.asmx?WSDL"
    #headers = {'content-type': 'application/soap+xml'}
    headers = {'content-type': 'text/xml'}
    body = """<?xml version="1.0" encoding="UTF-8"?>
    <SOAP-ENV:Envelope xmlns:ns0="http://ws.cdyne.com/WeatherWS/"
        xmlns:ns1="http://schemas.xmlsoap.org/soap/envelope/"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
        <SOAP-ENV:Header/>
        <ns1:Body><ns0:GetWeatherInformation/></ns1:Body>
    </SOAP-ENV:Envelope>"""

    response = requests.post(url, data=body, headers=headers)
    print

Categories : Python

What is the correct way to use python Requests
So, I've looked at the documentation and... I think it automatically keeps your session alive for you. Let me know if you have any problems with dying sessions, but assume that Requests will deal with that for you. I may have misinterpreted the docs, but I don't think you need to worry about it. From the documentation: Keep-Alive Excellent news — thanks to urllib3, keep-alive is 100% automatic within a session! Any requests that you make within a session will automatically reuse the appropriate connection! Note that connections are only released back to the pool for reuse once all body data has been read; be sure to either set stream to False or read the content property of the Response object.

Categories : Python

Java Socket multiple requests
I would recommend using the while loop just for accepting clients. Create a new thread where you handle everything related to the socket remote; the thread should also have a while loop, where it reads from remote's InputStream, and it should not close the socket right after creation. Then you can see what exactly your browser sends. Because the way it is right now, you close the socket to the browser right after creation.

Categories : Java

multiple requests accessing the same web method
I am assuming that by web method you mean a method in your code that a servlet container like Tomcat's Catalina would map an HTTP request to. Tomcat tries to service each request in its own thread, and I would assume these threads would eventually get run against the sole instance of the singleton object that has the web method. The maxThreads attribute in server.xml can set a limit on how many such threads get spawned at a time.

Categories : Multithreading

Multiple requests to same resource using Restangular
After further investigation it seems that Restangular does not implement a feature to limit requests for the same resource. The number of requests that go out to the server is dependent on the browser: Chrome sends out only one GET request, IE 10 sends out two.

Categories : Angularjs

Only last of multiple ajax requests gets completed
It's at least partially because each new XMLHttpRequest is being assigned to the same global x, which can only hold one of them at a time. This means later references to x.readyState and x.responseText aren't always referring to the "correct" instance. You'll want to declare x when or before setting it so it's scoped and unique to each Ajax request:

    var x = new XMLHttpRequest();

For more info, see Difference between using var and not using var in JavaScript.

Categories : Javascript

Multiple database requests with one php function
Of course the $row variable doesn't exist in the function; you need to pass it as a parameter.

    function action($row) {
        // ...code...
        // Do something with $row
    }

    foreach ($result1 as $row) {
        action($row);
    }

    foreach ($result2 as $row) {
        action($row);
    }

Categories : PHP

Speed up web requests by making multiple at once?
Considering you have roughly 7 Mbit/s (1 MB/s, counting high), and you get 2.888 pages per second (10,400 pages per hour), I'd say you're maxing out your connection speed (especially if you're running ADSL or WiFi; you're hammering away with TCP connection handshakes for sure). You're downloading a page containing roughly 354 kB of data in each of your processes, which isn't half bad considering that's close to the limit of your bandwidth. Take into account TCP headers and everything that happens when you actually establish a connection (SYN, ACK, etc.), and you're at a decent speed, to be honest. Note: this only accounts for the download rate, which is much higher than your upload rate; the upload direction is also important, since it is what actually transmits your connection requests and headers to the web server.
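The arithmetic behind that estimate, using the figures quoted in the answer above:

```python
pages_per_second = 2.888
page_size_kb = 354  # kB per page, as quoted above

# Throughput: pages/second times kB/page gives kB/second.
throughput_kb_s = pages_per_second * page_size_kb
print(round(throughput_kb_s))  # about 1022 kB/s, i.e. roughly 1 MB/s

# Cross-check the pages-per-hour figure.
pages_per_hour = pages_per_second * 3600
print(round(pages_per_hour))   # about 10400 pages per hour
```

So at 354 kB per page, 2.888 pages/s really does saturate a roughly 1 MB/s link.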

Categories : Python

How do I share objects between multiple get requests in PHP?
I would not implement the cache at the (PHP) application level. REST is HTTP, therefore you should use a caching HTTP proxy between the internet and the web server. Both servers, the web server and the proxy, could live on the same machine as long as the application grows (if you worry about costs). I see two fundamental problems when it comes to application- or server-level caching. First, using memcached would lead to a situation where a user session must be bound to the physical server where the memcache exists; this makes horizontal scaling a lot more complicated (and expensive). Second, software should be developed in layers; caching should not be part of the application layer (and/or business logic). It is a different layer using specialized components. And as there are well kn

Categories : PHP

How to handle multiple requests using HttpURLConnection in a MVC app?
You will have to use a Proxy while opening a connection. Using a proxy provides a new IP address to the server each time, so you can be sure that the server maintains a different session for each request. Your code will be something like the following:

    CookieHandler.setDefault(new CookieManager(null, CookiePolicy.ACCEPT_ALL));
    URL url = new URL("http://google.com");
    HttpURLConnection connection = (HttpURLConnection) url.openConnection(new Proxy("some_proxy"));

Categories : Java

Multiple Ajax Requests per MVC 4 View
I redesigned the way I do my context now. I have my context, then I implement IDbContextFactory<TContext>, called DefaultContextFactory<MyContext>, and I inject them. In the Repository's public constructor I have _context = contextFactory.Create(). Then throughout the repository I just use _context.WhatEver() and it's fine. I also did, in the ModuleLoader, Bind<IRepository>().To<DefaultRepository>().InTransientScope() in order to make every call to it create a new repository. I don't need a repository factory because I only have one repository!

Categories : C#

SOAP requests with multiple namespaces
To my knowledge, Spyne does the right thing there and the request is incorrect. Child elements are always under the parent's namespace. The children of those elements can be in their own namespace.

    <a:Foo xmlns:a="www.example.com/schema/a" AttrA="a1" AttrB="b2">
        <a:Baz xmlns:b="www.example.com/schema/b" AttrC="c3"/>
        <a:Bar>blah</a:Bar>
    </a:Foo>

That said, you can just use the soft validator, which doesn't care about namespaces.

Categories : Python

How do I handle multiple web requests one at a time?
Running rails s on a project only allows you to handle one web request at a time. However, you should not be using this in production. In production, use something like Passenger, which automatically starts multiple processes as needed so that you don't have to wait on a process to complete another user's action. Hope that helps!

Categories : Ruby On Rails

Multiple HTTP requests in one AsyncTask
I suggest using Google Volley. It's a great and simple library for networking and remote image loading. Volley "hides" the whole threading issue in its core, all you have to do is issue a request. No need to manage AsyncTasks. Check it out.

Categories : Android

How does python know that you need to interface the requests module through api.py?
See here: https://github.com/kennethreitz/requests/blob/master/requests/__init__.py E.g. if 'requests' is a directory which has an __init__.py, Python executes this file each time it sees from requests import ... or import requests. See more in Modules.

Categories : Python

Using Python's requests library instead of cURL
You're sending params, not data:

    p = requests.post(token_url, params=data)

When you pass a dictionary as a params argument, requests tries to send it as part of the query string on the URL. When you pass a dictionary as a data argument, requests will form-encode it and send it as the POST data, which is the equivalent of what curl's -F does. You can verify this by looking at the request URL. If print(p.url) shows something like http://api.instagram.com/oauth/access_token?client_id=xxxxxx&client_secret=xxxxx&…, that means your parameters ended up on the URL instead of in the POST data. See Putting Parameters in URLs and More complicated POST requests in the quick-start documentation for full details. For more complicated debugging, you may want to consider pointing bot
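The distinction in concrete terms, sketched with the standard library so it runs without a network (the client_id/client_secret values are placeholders; requests performs the same encodings internally via urllib3):

```python
from urllib.parse import urlencode

data = {"client_id": "xxxxxx", "client_secret": "xxxxx"}

# params=... means the dict lands in the query string on the URL:
url_with_params = "http://api.instagram.com/oauth/access_token?" + urlencode(data)
print(url_with_params)

# data=... means the dict is form-encoded into the POST body instead:
post_body = urlencode(data)
print(post_body)  # client_id=xxxxxx&client_secret=xxxxx
```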

Categories : Python

Shutting off URL Encoding in Python Requests
It doesn't look like it can be done any longer. Every URL gets passed through requote_uri in utils.py. And unless I'm missing something, the fact that this API wants JSON with spaces in a GET parameter is a bad idea.

Categories : Python



© Copyright 2017 w3hello.com Publishing Limited. All rights reserved.