Is performing computations inside cellForRowAtIndexPath heavy?
cellForRowAtIndexPath is called every time a cell is shown or changes (for example while scrolling), so the code executed there should be kept light, though it depends on your data. If you have to carry a large amount of data and prepare it all up front, the loading time will increase and your device may run out of memory.

Categories : IOS

MongoDB High Avg. Flush Time - Write Heavy
Without seeing any disk statistics, I am of the opinion that you are saturating your disks. This can be checked with iostat -xmt 2, looking at the %util column. Please don't disable journalling - you will only cause more issues later down the line when your machine crashes. Separating collections will have no effect. Separating databases may, but if you're I/O bound, this will do nothing to help you. Options: if I am correct and your disks are saturated, adding more disks in a RAID 10 configuration will vastly help performance and durability - more so if you move the journal off to an SSD. Assuming that this machine is a single server, you can set up a replica set and send your read queries there. This should help you a fair bit, but not as much as the disks.

Categories : Mongodb

High performing Bitmap Drawing solution in c#
How important is the image quality? If you need faster results, try InterpolationMode.NearestNeighbor; it should be much faster, but the results will be rather low quality. HighQualityBicubic (what you're currently using) produces the best results, but at the lowest performance.

Categories : C#

Azure Cloud Computing high availability vs NEO4J high availability?
The short answer is probably yes. Windows Azure provides you with infrastructure that allows you to build a high-availability system; it won't make any system highly available by magic. As NEO4J is stateful, each node will need to share some state (with only one node, Azure doesn't give you any SLA; your instance will be down), and the way to do that depends on how NEO4J works, so you will need to rely on NEO4J's own mechanisms. I don't know how NEO4J works internally, but you won't be able to skip designing a highly available architecture around NEO4J on Windows Azure infrastructure. "Cloud" may be a magic buzzword that makes things happen at the management level, but down at the hard real-world level Harry's magic wand doesn't exist.

Categories : Neo4j

quicksand sorting price values low to high / high to low
val() gives you a string, so > and < comparisons are lexicographic (not numeric). Try wrapping the values in parseInt() or parseFloat(), and make sure to add appropriate error handling for values that don't parse.
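The same pitfall is easy to demonstrate in any language; here is a quick sketch in Python (the to_number helper is hypothetical, and real price strings may carry currency symbols that need stripping first):

```python
# Lexicographic vs numeric comparison of price strings.
def to_number(value):
    """Convert a price string like '12.50' or '$12.50' to a float."""
    return float(value.strip().lstrip("$"))

prices = ["100", "25", "9"]
# Plain string comparison sorts character by character: "100" < "25" < "9"
assert sorted(prices) == ["100", "25", "9"]
# Converting first gives the intended numeric order.
assert sorted(prices, key=to_number) == ["9", "25", "100"]
```

In the jQuery case the equivalent fix is calling parseFloat() on val() before comparing.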

Categories : Jquery

Varnish High DB Connections In High Traffic
Mostly, spiking backend connections have little to do with your Varnish configuration and a lot to do with the cachability of your site. Are there cookies that prevent you from caching efficiently? You can choose to strip them, or remove all but chosen ones; there are examples of both on the Varnish site. Run varnishstat and check your hit rates during peaks. Is the cache hit ratio good? Is it the same as during low load? If it's the same or higher under low load, it's easy to work on improving it at any time. Run varnishtop -i txurl to see which requests are most frequently sent to the backend servers. Maybe some URLs are simply not cached due to faulty headers? Maybe some pages can be cached longer? Maybe some parts of the pages can be cached with ESI? Make sure your varnish is not

Categories : Mysql

When to use low < high or low + 1 < high for loop invariant
If your invariant is that the target must lie in low <= i <= high, then you use while (low < high); if your invariant is that the target must lie in low <= i < high then you use while (low + 1 < high). [Thanks to David Eisenstat for confirming this.]
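The two conventions can be sketched as follows (in Python; the function names are mine, not from the question):

```python
def leftmost_insertion(a, target):
    # Invariant: the answer i satisfies low <= i <= high.
    low, high = 0, len(a)
    while low < high:
        mid = (low + high) // 2
        if a[mid] < target:
            low = mid + 1
        else:
            high = mid
    return low  # first index at which target could be inserted

def last_leq(a, target):
    # Invariant with sentinels: a[low] <= target < a[high],
    # where low starts just "before" the array and high just past it.
    low, high = -1, len(a)
    while low + 1 < high:
        mid = (low + high) // 2
        if a[mid] <= target:
            low = mid
        else:
            high = mid
    return low  # index of the last element <= target, or -1

assert leftmost_insertion([1, 3, 5], 3) == 1
assert last_leq([1, 3, 5], 3) == 1
```

Either loop condition is correct as long as the updates preserve the stated invariant; pairing a condition with the wrong invariant is what causes the classic off-by-one and infinite-loop bugs.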

Categories : Algorithm

Can AJAX be used on heavy DB operations
Yes, it can be; just give some sort of feedback to the user that the operation takes a while. Display a message or a loading image while the operation is being performed, then hide it when the operation is done.

Categories : C#

BBM bar export fail on heavy APK
Well, it looks like, from what you have there, a java.lang.IllegalArgumentException is thrown, which indicates that a method has been passed an illegal or inappropriate argument. From your error message it appears to be a problem with one of your images, and I don't think it's because the program resized the icon. If there are more image types in your program, check them out; you might need a try/catch statement. Also, http://docs.oracle.com/javase/7/docs/api/java/lang/IllegalArgumentException.html just documents the exception; I just used Google to find it.

Categories : Misc

Nodejs on heavy load
First, I'm not sure how you have a clock speed of 13.6GHz for a single thread. I'd assume your CPU has multiple cores, or your motherboard supports multiple processor sockets, and 13.6 is simply a sum. (8.2GHz was a world record, set July 23rd, 2013.) Secondly, I'd ask yourself why the disconnect is happening. Are you watching processor load - is a single thread maxing out its processor allocation (i.e. 100% usage on a single core)? How's your RAM consumption: is it climbing, has the OS offloaded memory onto the page file/swap partition, could there be a memory leak? Is your network bandwidth capped? Has it reached its maximum capacity? My high-level recommendations are: make sure your application is non-blocking. This means using asynchronous methods whenever possible. By des

Categories : Node Js

AngularJS - Computation-Heavy Tasks
Because JavaScript is single threaded, you need to either do the computations server-side, or do the timeouts in between the processing (See underscore's defer(), http://underscorejs.org/#defer). Otherwise, the UI will inevitably get blocked.
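The chunking idea is language-independent; here is a minimal Python sketch of splitting heavy work into slices so control can be yielded between them (the chunk sizes are arbitrary):

```python
def chunked(items, size=1000):
    """Yield successive slices of the input list."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

total = 0
for chunk in chunked(list(range(10000)), size=2500):
    total += sum(chunk)  # the heavy work for this slice
    # In a browser you would schedule the next slice with setTimeout or
    # _.defer here, letting the UI repaint between slices.
assert total == sum(range(10000))
```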

Categories : Angularjs

Creating heavy user controls in WPF
200 controls shouldn't pose that big a rendering problem; WPF on a decent machine can take a few thousand primitives. You can show a progress bar while loading your data and while parsing it. Then you can throttle the creation of UI elements if needed, by having an off-UI-thread process loop over your data and call the UI thread to instantiate controls. You can even separate instantiations with a small sleep to let the screen render, but only use this for VERY heavy UI... ...that being said, if your UI is so heavy, you're probably designing it wrong. The question should not be "how many UI elements can I put in before my UI slows down to a drag?" but "what's the smallest number of active UI elements that can do the job?". The word "active" refers to the approach taken by listviews where the

Categories : C#

triple buffer heavy flickering
Try this: make a BufferedImage in your myPanel class like this:

private BufferedImage image = new BufferedImage(WIDTH, HEIGHT, BufferedImage.TYPE_INT_RGB);

Now draw that image ABOVE everything else in your draw method, like this:

public void draw() {
    BufferStrategy bs = getBufferStrategy();
    if (bs == null) {
        createBufferStrategy(3);
        return;
    }
    Graphics g = bs.getDrawGraphics();
    g.drawImage(image, 0, 0, getWidth(), getHeight(), null);
    // Draw your other stuff after this...
    g.fillOval(20, 20, 20, 20);
    g.dispose();
    bs.show();
}

This draws a black background onto the screen so that it no longer flickers. Now you can draw anything on top of that image.

Categories : Java

Hbase Heavy write Exception
This is caused not by heavy writes but by big writes. "processingtimems":761893 means the write operation did not finish within 761 seconds, and before the action finished, the client timed out. Try to reduce the number of items in the multi operation.

Categories : Hadoop

Android when to save heavy data?
One suggestion would be to startActivityForResult from the gallery activity. When the picture is edited, setResult with that bitmap in the second activity. In onActivityResult, get the bitmap and show the edited picture in the gallery. In parallel, write the image to a file in a thread.

Categories : Android

Load very heavy stream with GSON
From my experience, yes, you can use Google GSON to stream JSON data. This is an example of how to do it:

APIModel result = new APIModel();
try {
    HttpResponse response;
    HttpClient myClient = new DefaultHttpClient();
    HttpPost myConnection = new HttpPost(APIParam.API_001_PRESENT(serial_id, api_key));
    try {
        response = myClient.execute(myConnection);
        Reader streamReader = new InputStreamReader(response.getEntity().getContent());
        JsonReader reader = new JsonReader(streamReader);
        reader.beginObject();
        while (reader.hasNext()) {
            String name = reader.nextName();
            if (name.equals("

Categories : Java

Read heavy text file
You're parsing JSON. You could make the input file smaller by removing prettifying (e.g. indentation, newlines etc) if it's there. You could also try a parser that reads directly from streams, hopefully it won't need to buffer everything at once. For example, Android provides JsonReader, which allows you to parse a stream and control the data structures yourself, which means you could use more memory efficient structures, and it also wouldn't buffer the whole stream. Unfortunately, it was added in API level 11, so backward compatibility might be an issue. One alternative is, if the top level object is an array, split it into several smaller arrays, maybe in different files, parse them separately and merge the subarrays. If the base objects have similar structures you can translate them
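The "split the top-level array" alternative can be sketched with a standard JSON library (Python's json module here; the pieces are inline strings, though in practice they would be separate files):

```python
import json

# Hypothetical sub-arrays produced by splitting one large top-level array.
pieces = ['[{"id": 1}, {"id": 2}]', '[{"id": 3}]']
merged = []
for piece in pieces:
    merged.extend(json.loads(piece))  # parse each small array separately
assert [obj["id"] for obj in merged] == [1, 2, 3]
```

Each piece needs far less memory to parse than the whole array at once, at the cost of the merge step.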

Categories : Java

Indexing Heavy dataset in Solr
It could be because of caching, sure. Hard to say without more information. However, I would say no, you should not turn off document caching; please see the documentation on documentCache. The size of the documentCache should always be greater than <max_results> * <max_concurrent_queries>, to ensure that Solr does not need to refetch a document during a request. You might be able to scale your cache settings back somewhat, if necessary. Referring back to the documentation above, you could take its advice about lazy loading your documents. A better approach might be: do not store huge datasets in the index. A very typical pattern is to index large datasets but store them entirely external to the index, and fetch them from whatever external datasource you

Categories : Solr

Why is node.js not suitable for heavy CPU apps?
Node is, despite its asynchronous event model, by nature single threaded. When you launch a Node process, you are running a single process with a single thread on a single core. So your code will not be executed in parallel; only I/O operations are parallel, because they are executed asynchronously. As such, long-running CPU tasks will block the whole server and are usually a bad idea. Given that you just start a Node process like that, it is possible to have multiple Node processes running in parallel, though. That way you could still benefit from your multi-core hardware, even though a single Node process does not. You would just need some load balancer in front that distributes requests across all your Node processes. Another option would be to have the CPU work in separate pr
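The same pattern - one worker process per core, with work distributed among them - looks like this in Python (a sketch of the idea, not of Node's own cluster mechanism):

```python
from multiprocessing import Pool

def cpu_heavy(n):
    """Stand-in for a long-running CPU-bound task that would block an event loop."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Four worker processes run the task in parallel, roughly one per core,
    # so no single process is blocked for the whole duration.
    with Pool(processes=4) as pool:
        results = pool.map(cpu_heavy, [10000] * 4)
    assert results[0] == sum(i * i for i in range(10000))
```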

Categories : Javascript

authentication for a REST Api designed for 1 consumer and heavy use
It depends on your audience. If this is for internal consumption only, checking the ip or adding basic auth through https should be more than enough. If you are planning on exposing your business to other clients, then you might want to implement a more complex auth mechanism. I like what Amazon uses but you guys can tweak this to meet your own needs.

Categories : Api

Microsoft SQL Server backup file very heavy
The number of rows is almost irrelevant - it's the size of the data within those rows that counts. Your friend might have every row containing half the amount of data in the Linking column, for example.

Categories : SQL

Heavy issue here trying to upload data to a database
$db is connected to the database using the mysql_ functions, but you are querying with the mysqli_ functions. Use one, not both. There are two things you need to do here to get an idea of what is going on: first, change all your mysql_ calls to mysqli_ calls; second, add some error reporting (for example, appending or die(mysqli_error($db)); to every line where you query), which should point you in the right direction. Also, your SQL query should read INSERT INTO table (stuff) VALUES ('$stuff') rather than INSERT INTO table SET stuff = '$stuff'.

Categories : PHP

Wcf NetNamedPipesBinding replying slow on heavy load
If you are creating a separate Thread for each request, you could be starving your system. Since both client and server are on the same machine, it may be the client's fault the server is slow. There are lots of ways to do multithreading in .NET and a new Thread may be the worst. At worst you should move your calls to the thread pool (http://msdn.microsoft.com/en-us/library/3dasc8as.aspx) or you may want to use the async methods of the proxy (http://msdn.microsoft.com/en-us/library/ms730059.aspx).

Categories : C#

Settimeout UI refresh and heavy loops strategy
Your timeouts will all fire at almost the same time - i.e. the second will not wait for the first to finish. What your load_data() function does is set three timeouts, all scheduled in 300ms, and then return. Then, in about 300ms, read1, read2, and read3 will be called, each with its own scope. You could handle this with flow.js (https://github.com/willconant/flow-js), or on your own: call read1 with its scope set to the "this" of load_data(), have read1 emit "done1" at its end, and have load_data() listen for "done1" and fire read2 when the event is caught. You can extend this logic to as many steps as you want.

Categories : Javascript

What is the right way to load heavy Task on android ViewPager?
You can implement ViewPager.OnPageChangeListener and run your AsyncTask in onPageSelected(). For example:

public class MyActivity implements ViewPager.OnPageChangeListener {
    @Override
    public void onPageSelected(int position) {
        new MyAsyncTask().execute();
    }
}

However, as Tyczj pointed out in the comments, this defeats the purpose of a ViewPager trying to keep Views loaded. That feature is designed to make your app look smooth; without it, your Views will look empty (or take on their default appearance) while you load your data.

Categories : Android

SQL vs. NoSQL database for 'tags-heavy' CRM application
Our ERP system is based on UniData (NoSQL). It is okay for performing the standard tasks needed to do business, like entering customers, creating sales orders, invoicing, etc., but when it comes to creating reports that were not originally foreseen, it is quite cumbersome. The system only lets you create reports off of one table; if you need data from another table you have two options: 1. create what is called a virtual attribute for every field you need to look up from a different table, or 2. write a UniBasic program to retrieve the data needed. To meet most of our business needs on the reporting front, it is more beneficial for us to export the data to SQL and then run the reports in SQL; the result is that the reports run quicker from SQL, and most of the time a reporting tool can be used to cr

Categories : SQL

How to LRU-cache numerous objects made of C++ STL heavy structures?
I actually built caches (not only LRU) recently. Options 2 and 3 are quite likely not faster than re-reading from disk; that's effectively no cache at all. Also, this would be a far heavier dependency than Boost. Option 1 can be challenging. For instance, you suggest "a lock". That would be quite a contended lock, as it must protect each and every lifetime update, plus all LRU operations. Since your objects are already heavy, it may be worthwhile to have a unique lock per object. There are intermediate variants of this solution, where there is more than one lock, but also more than one object per lock. (You still need a lock to protect the whole map, but that's for replacement only.) You can also consider whether you really need strict LRU. That strategy assumes that the chances of an object b
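A minimal sketch of option 1, shown in Python for brevity (a C++ version would typically pair std::list with std::unordered_map; the single lock here is exactly the contended one the answer warns about):

```python
from collections import OrderedDict
from threading import Lock

class LRUCache:
    """Map guarded by one lock, with OrderedDict tracking recency.
    Per-object locks (suggested above for heavy objects) are omitted."""
    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()
        self._lock = Lock()

    def get(self, key):
        with self._lock:
            if key not in self._data:
                return None
            self._data.move_to_end(key)  # mark as most recently used
            return self._data[key]

    def put(self, key, value):
        with self._lock:
            self._data[key] = value
            self._data.move_to_end(key)
            if len(self._data) > self.capacity:
                self._data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # "a" is now most recently used
cache.put("c", 3)  # evicts "b"
assert cache.get("b") is None
assert cache.get("a") == 1
```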

Categories : C++

Is there a way to tuneup a complex count query against heavy data?
Possibly this will be helpful for you -

SELECT pa.*
FROM empl
JOIN (
    SELECT pa.aID, cnt = COUNT(pa.pid)
    FROM pam pa
    GROUP BY pa.aID
) pa ON empl.pid = pa.pid

Or this -

SELECT pa.aID, COUNT(pa.pid)
FROM pam pa
WHERE EXISTS (
    SELECT 1 FROM empl WHERE empl.pid = pa.pid
)
GROUP BY pa.aID

Or even this -

SELECT pa.aID, cnt = COUNT(pa.pid)
FROM pam pa
GROUP BY pa.aID

Categories : SQL

Which character set in Oracle supports the Microsoft heavy arrow
It's probably part of a different font (wingdings), not a special character in a standard font (e.g. Arial). Therefore, you would have to use some kind of application encoding to get this as part of a regular text -- for instance, when your application sees --> it would replace it with the wingdings character, the same way Word does.
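The application-level substitution described above - mapping an ASCII token to a real Unicode arrow instead of relying on a Wingdings glyph - can be sketched like this (U+27A1 is one candidate heavy-arrow code point; the exact character Word substitutes may differ):

```python
HEAVY_ARROW = "\u27a1"  # assumed code point for a heavy rightwards arrow

def substitute_arrows(text):
    """Replace the ASCII token '-->' with a Unicode arrow, as Word does."""
    return text.replace("-->", HEAVY_ARROW)

assert substitute_arrows("go --> here") == "go \u27a1 here"
```

Storing a genuine Unicode character also requires the Oracle column and client character set to support it (e.g. AL32UTF8).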

Categories : SQL

show loading image while javascript has heavy load
Your structure should be like this:

showLoader();
$.ajax({
    url: '',
    dataType: '',
    async: false, // <-- this depends on your needs.
    success: function(dataObj) {
        // Do your initialization of DataTables
        hideLoader();
    }
});

I believe this serves your needs. It should not be asynchronous, because you want the table to be there and ready before letting the user do anything, although you could make it async if you don't do much after the ajax call. You don't need threads for this; ajax is asynchronous by default if you want to do something while waiting for the ajax (initialization of DataTables included) to finish. EDIT: showLoader and hideLoader are your functions for showing and hi

Categories : Javascript

heavy customized routing constraint slowing site down?
You could optimise your code a bit: use .ToString() instead of string.Format (string.Format is comparatively expensive); use TryGetValue() instead of your two separate operations; initialise the dictionary at app start, or even consider dropping the dictionary altogether - a large switch statement returning a string is faster than the dictionary.

Categories : C#

Execute function on interval only if server is not under heavy load
With these particular requirements, I would perform this check on the server side instead.

var checkFunctionCaller = function () {
    // call an async function on the server side and provide a callback
    // to be invoked when the server returns an answer
    server.myCheckFunction(function (result) {
        // if the server was able to run the check, it returns 'done'
        // and everything is fine
        if (result === "done") {
            // do nothing
        } else {
            // otherwise, re-schedule a check in five minutes
            setTimeout(checkFunctionCaller, 5 * 60 * 1000);
        }
    });
};

setInterval(checkFunctionCaller, 3600000);

Categories : Javascript

using dbms_pipe to insert data coming from heavy transaction
A possible approach to consider: accumulate appropriate portions (depending on the frequency) of incoming data in a local file in delimited form (e.g. comma- or tab-delimited). Once in a while, start a new file and load the previous one into the database with LOAD DATA LOCAL INFILE; it's the fastest way to load batches of data into MySQL. After a successful load, delete the file. If for some reason you can't use files for buffering, at least use the multi-row insert syntax INSERT INTO table_name (col1, col2, ...) VALUES (value1, value2, ...), (value3, value4, ...), ... building a query string for a batch of data and then executing it at once. Also take a look at Speed of INSERT Statements: to optimize insert speed, combine many small operations into a single large operation. Ideally, you make a
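The batching principle - one statement for many rows instead of one statement per row - sketched with Python's stdlib sqlite3 module (the answer targets MySQL, but executemany shows the same idea, and table/column names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (col1 TEXT, col2 INTEGER)")

rows = [("a", 1), ("b", 2), ("c", 3)]
# One call for the whole batch instead of one INSERT per row.
conn.executemany("INSERT INTO t (col1, col2) VALUES (?, ?)", rows)
conn.commit()

assert conn.execute("SELECT COUNT(*) FROM t").fetchone()[0] == 3
```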

Categories : C++

Processing heavy operations with doGet() method on a servlet
If you want to "turn" the request into a POST request, you can simply call the doPost method from doGet, but that won't change anything if you want the POST functionality. If your tool is sending a large number of request parameters, then all the parameters will be appended to the URL, which is not safe at all. In the case of a GET request, the length of the URL that can be handled successfully also differs from container to container. As The New Idiot said, idempotency means performing the same operation over and over again without causing any side effects. The method GET is idempotent, as it just asks for a resource, and requesting a resource over and over again cannot cause any problem; but POST consists of form submission, and entering the form data more than once can cause serious problems.

Categories : Java

Requests canceled with server under heavy load hang on IIS
A CancellationToken doesn't dictate cancellation; the implementer may inspect it only after a very long interval, or ignore it completely. The correct way to implement a timeout is:

if (yourTask == await Task.WhenAny(yourTask, Task.Delay(3000)))
{
    // task completed
}
else
{
    // timeout occurred
}

Categories : C#

mysql MyISAM vs NDB storege engine for heavy reads
It supports joins, of course. The speed of joins will depend on the indexes and the columns you are joining on... As far as my practice with NDB goes, it is much better - the "next generation" after MyISAM :) Not only is it faster, but you can have multiple nodes inserting/writing at the same time to the backend files/processes... The only flaw is that it doesn't support transactions, but MyISAM doesn't either, so you'd better avoid "heavy" updates or deletes. You can check the comparisons here. Hope I helped :)

Categories : Mysql

Simple PHP Script Makes Heavy Server Load
Maybe you can identify which one of your lookups eats the most time by checking the timings:

$t0 = microtime(1);
$teamid = $HT->getLeague($league)->getTeam($i)->getTeamId();
echo "lookup teamid: ".(($t1 = microtime(1)) - $t0)."<br>";
if ($team->getId() != 2286094) {
    $youthteamid = $HT->getTeam($teamid)->getYouthTeamId();
    echo "lookup youthteamid: ".(($t2 = microtime(1)) - $t1)."<br>";
    $youthteam = $HT->getYouthTeam($youthteamid);
    echo "lookup youthteam: ".(($t3 = microtime(1)) - $t2)."<br>total time: ".($t3 - $t0)."<br>";
}

Categories : PHP

SQL Server putting data into temp table first before heavy join
Here are the "typicals" that I try. I usually try them out and see what happens under load and under "big data" that represents production row counts, not dev row counts. Going from memory: if it is "one time" use, I try the derived-table method. If the data in the "holder" table can be reused, I start with a @variableTable, provided the number of rows will be small. The only time I've seen a @variableTable hurt you is if you produce some aggregate results where the "summary rows" are only a few, but to generate them you hit a large number of rows. Think something like SELECT StateAbbreviation, COUNT(*) FROM dbo.LargeTableOfData... there will only be 50 or so rows in the result table, BUT the aggregate data comes from a large table with lots of rows. Then I go to

Categories : Sql Server

Should I use a separate REST backend project and a javascript heavy frontend?
No best answer here; it really depends on your project. Play allows you to do both anyway.

Cases where you'd be better off with plain HTML:
- You have to support IE 6/7(/8)
- Your content needs to be found by search engines
- You are very content-oriented (texts, images)
- You want your users to load the page fast

Cases where it's better to choose a "one page app":
- A lot of user interaction, keeping state
- Collaborative and realtime features
- Non-hierarchical navigation (graphs)
- A lot of data visualisations

Categories : Api

Log4j - missing logs in zipped files under heavy load
You could try to create a copy of the log in some temp location (or the same location, if space is not a constraint) and then zip the copy. It might be that at the moment zipping happens, the logger buffer is trying to write back and failing. Since log4j fails silently, it is difficult to trace such an error.

Categories : Java



© Copyright 2017 w3hello.com Publishing Limited. All rights reserved.