Home » Technical Support » ElevateDB Technical Support » Support Forums » ElevateDB General
Messages 1 to 7 of 7 total
Remote Session speed |
Thu, Jun 5 2014 5:12 PM
Lee Mc Cauley | I am a new ElevateDB user with many questions. I am taking them one at a time.
I have a remote session. On a client machine, I have to retrieve a large number of records (100,000+) and load them into a list. Right now, this process takes a great deal of time. On the computer where the ElevateDB Server is running, it is fast; on a client machine, it is very slow. What can I do to speed this up? Let me know if you need more details; I thought I would start with some basic information. Thank you, Lee
Thu, Jun 5 2014 5:28 PM
Terry Swiers | Hi Lee,
> What can I do to speed this up? Let me know if you need more details. I thought I would start with some basic information.

A couple of tips:

1. Set the compression level to 5 or 6 for the connection between the client and the server.
2. Set the RemoteReadSize for the TEDBTable or TEDBQuery to 25 or 50. This will allow EDB to pull down multiple records at a time rather than pulling them one by one.

---------------------------------------
Terry Swiers
Millennium Software, Inc.
http://www.1000years.com
http://www.atrex.com
---------------------------------------
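For reference, Terry's two tips might look roughly like this in a Delphi client. This is a hedged sketch: the component names (EDBSession, EDBQuery, TForm1) are illustrative, and the exact values are the starting points suggested above, not tuned settings.

```pascal
// Sketch of the two tuning suggestions, assuming an already configured
// TEDBSession (EDBSession) and TEDBQuery (EDBQuery) in a client application.
procedure TForm1.ConfigureRemoteAccess;
begin
  // 1. Compress traffic between client and server;
  //    5 or 6 is suggested as a balance between CPU cost and bandwidth saved.
  EDBSession.RemoteCompression := 6;

  // 2. Fetch rows in batches instead of one network round trip per record.
  EDBQuery.RemoteReadSize := 50;
end;
```

Both properties are per-session and per-dataset respectively, so they can be experimented with independently.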
Fri, Jun 6 2014 2:46 AM
Roy Lambert NLH Associates Team Elevate | Lee
>I have a remote session. On a client machine, I have to retrieve a large number of records (100,000+) and load them into a list.
>
>Right now, this process is taking a great deal of time. On the computer where the ElevateDB Server is running, it is fast. On a client machine, it is very slow.
>
>What can I do to speed this up? Let me know if you need more details. I thought I would start with some basic information.

Whilst I agree with what Terry says, I'd approach it from a totally different angle. Whatever you do, you will never reduce the time below that taken to run the query and transfer the data (plus the associated overheads) over to the client machine.

Terry has made two good suggestions. His first may or may not help: it depends on the trade-off between compressing/uncompressing the data and transporting it in compressed form. Sometimes the data can't be compressed (or not much), so you extend the time taken. You'll need to experiment. His second suggestion essentially reduces the associated overheads, so whilst it's a good idea and can have a lot of impact, it may not do what you want.

To me the important question is why you load the records into the list and what you do with it. If it's "just" calculations, can these be performed on the server? If it's for display, can this be done in a DBGrid rather than a list? Is the data needed all at once, or can it be obtained in a background thread and passed to the foreground as needed?

Depending on what data you're extracting, you could use the LIST command to extract it into one CLOB, transport that, and interpret it at the other end. You could also create a temporary table on the server and stream the whole table across to the client (if you run the SQL on the client and then read each row individually to populate the list, you have a tremendous amount of overhead, i.e. SQL commands).

If you can supply some less basic information, we can probably give better advice.

Roy Lambert
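Roy's background-thread idea can be sketched roughly as below. This is only an outline, assuming the worker uses its own thread-private TEDBSession (ElevateDB sessions should not be shared across threads); WorkerQuery, ListBox1, and the field name are all illustrative.

```pascal
// Fetch rows off the main thread and hand them to the UI in batches,
// so the application stays responsive while 100,000+ rows load.
TThread.CreateAnonymousThread(
  procedure
  var
    Chunk: TStringList;
  begin
    Chunk := TStringList.Create;
    try
      WorkerQuery.Open;  // TEDBQuery tied to a thread-private TEDBSession
      while not WorkerQuery.Eof do
      begin
        Chunk.Add(WorkerQuery.FieldByName('AnimalName').AsString);
        if Chunk.Count = 1000 then
        begin
          TThread.Synchronize(nil,
            procedure
            begin
              ListBox1.Items.AddStrings(Chunk);  // pass a batch to the UI
            end);
          Chunk.Clear;
        end;
        WorkerQuery.Next;
      end;
      // (A final Synchronize for the last partial chunk is omitted here.)
    finally
      Chunk.Free;
    end;
  end).Start;
```

The batching keeps the number of Synchronize calls (and UI repaints) small relative to the row count.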
Fri, Jun 6 2014 10:38 AM
Adam Brett Orixa Systems | Lee
My only addition to what others have said is that if the query runs fast on the server but slowly on the client, there are probably issues with latency in the network, i.e. the "width of the pipe" through which the data is transferring. To some extent this cannot be solved by EDB; it requires an update to the network infrastructure. Check that the network router is decently fast and that other network issues are not blocking or slowing the data. A decently fast router is only $100 or so, so it will soon pay for itself if it speeds up data transfer.

Finally, the problem may relate to the updating of the list component in your GUI/application. Remember that in Delphi, loading items into a list is 1000 times faster if you set the list up correctly rather than using the default settings. I am sure other IDEs and languages have similar issues.
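"Setting the list up correctly" in Delphi usually means suspending repaints and sorting during the bulk load. A minimal sketch (component and parameter names are illustrative):

```pascal
// Bulk-load a TListBox without repainting or re-sorting per item.
procedure TForm1.LoadList(Query: TEDBQuery);
begin
  ListBox1.Sorted := False;        // avoid a re-sort on every insert
  ListBox1.Items.BeginUpdate;      // stop the control repainting per item
  try
    while not Query.Eof do
    begin
      ListBox1.Items.Add(Query.Fields[0].AsString);
      Query.Next;
    end;
  finally
    ListBox1.Items.EndUpdate;      // one repaint at the end
  end;
end;
```

Without BeginUpdate/EndUpdate, each Add can trigger a repaint (and a binary-search insert when Sorted is True), which is where the "1000 times" difference comes from on large loads.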
Fri, Jun 6 2014 3:08 PM
Lee Mc Cauley | Adam Brett wrote:
> My only addition to what others have said is that if the query runs fast on the server but slowly on the client there are probably issues with latency in the network [...]

Thank you for all of the good information. It is truly appreciated. I did what Terry said, and that helped. I am aware that the component being loaded with the list could also be taking some time; in this case it is very little. I am also aware of the bottlenecks of a network.

This is an application for agriculture that we wrote 15 years ago. When the program executes, it loads a great deal of records into a list. 15 years ago we started with FlashFiler, then migrated to Nexus. I had some issues with the Nexus people through the years and always threatened to leave them. Now I have. I just couldn't stand the thought any more of paying more for my database each year than I do for Delphi. So much of what Nexus can now do, I do not use.

In Nexus you could set a BlockReadSize on the table, which helped a great deal in loading this list. I have played with this some in ElevateDB, but without much success. I am looking at ripping this list out, but I don't know if I am ambitious enough to do it. Today, there are so many more ways to handle what we are doing with this list. It would be a huge undertaking to rip it out.

Most of my applications are local, but the few that are remote are very large dairy herds.

Thanks for all of your help, Lee
Sat, Jun 7 2014 3:54 AM
Uli Becker | Lee,
in such a case (exporting 100,000+ records) I would choose a different approach.

I use replication in some of my applications. One of them creates update files every few minutes, and after a week or so you have tons of files to download before loading the updates. I wrote some code to zip all these files, download them as one file, unzip them, and load them. That improved the performance significantly.

You could use the same approach: just export the records, zip the resulting CSV file, download it, unzip it, and import it. That should be quite fast. You can do that in Delphi or write a module for your database. Have a look at the binaries; there you'll find a basic example of how to zip/unzip all files of a store.

Hope that helps

Uli
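The zip/unzip legs of Uli's approach can be done with Delphi's standard System.Zip unit (available since XE2). This sketch assumes the CSV has already been produced on the server side (e.g. via an ElevateDB export); the file and directory names are illustrative.

```pascal
uses System.Zip;

// Compress the exported CSV into a single archive for transfer.
procedure ZipExport(const CsvFile, ZipFileName: string);
var
  Zip: TZipFile;
begin
  Zip := TZipFile.Create;
  try
    Zip.Open(ZipFileName, zmWrite);
    Zip.Add(CsvFile);  // add the CSV to the archive
    Zip.Close;
  finally
    Zip.Free;
  end;
end;

// On the client: extract everything from the downloaded archive,
// then import the CSV into the local database.
procedure UnzipImport(const ZipFileName, DestDir: string);
begin
  TZipFile.ExtractZipFile(ZipFileName, DestDir);
end;
```

Since CSV is highly compressible text, the archive is typically a small fraction of the raw export, which is where the transfer-time win comes from.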
Sat, Jun 7 2014 7:15 AM
Lee Mc Cauley | Uli;
Thank you for the advice. This is a very interesting approach. I will take a serious look at this. Thank you very much, Lee
This web page was last updated on Tuesday, September 17, 2024 at 04:19 AM. © 2024 Elevate Software, Inc. All Rights Reserved.