Best way for 1 PHP script to handle large return

  • devilwood
  • Silver Member
  • User avatar
  • Posts: 436

Post 3+ Months Ago

I'm using an API to pull data from a cloud service. I want to pull some 30,000 records, and that number will only grow. The query simply gets records whose created date is after a date I specify, so I can break the data up by changing the date in the query. What is the best way in PHP to have one script repeatedly send the query and process the return? Do I just loop the query, handle the return (which is XML), and then free my variables, or is there a more sophisticated way? I've written memory-freeing code in the past for a memory-hog script and it didn't seem to do much; in the end I found another approach where I didn't have to manage memory and time myself. So I thought I'd check some opinions here. Also, on my test machine I have access to php.ini, but on the machine the script will run on I won't, so the script will have to make any php.ini changes at run-time.
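A minimal sketch of the loop-and-free approach described above, assuming the API is a plain HTTP endpoint (the URL, query parameter, and XML element names are hypothetical placeholders):

```php
<?php
// Sketch: pull records in date windows and free memory between batches.
// The endpoint and parameter names below are illustrative, not real.
$start = new DateTime('2010-01-01');
$end   = new DateTime('2010-07-01');
$step  = new DateInterval('P1M'); // one-month windows

for ($d = clone $start; $d < $end; $d->add($step)) {
    $url = 'https://example.com/api/records?created_after=' . $d->format('Y-m-d');

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);

    $xml = simplexml_load_string($response);
    foreach ($xml->record as $record) {
        // ...process each record...
    }

    // Free the large variables before the next iteration.
    unset($response, $xml);
}
```

Unsetting the response and the parsed tree at the end of each window keeps peak memory near the size of one batch rather than the whole data set.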

  • SpooF
  • ٩๏̯͡๏۶
  • Bronze Member
  • User avatar
  • Posts: 3422
  • Loc: Richland, WA

Post 3+ Months Ago

Is the "query" just a page request that returns an XML file? I know how to do this with SQL, but I'm not quite sure I understand how you are making the request.

If it's just a request to a page, you might be able to use cURL to download the file to your hard drive and then work with it however you like.
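Downloading straight to disk can be sketched like this with PHP's cURL extension; the URL and file path are stand-ins:

```php
<?php
// Sketch: stream a large XML response directly to a file so the whole
// body never has to sit in PHP's memory. URL/path are placeholders.
$url = 'https://example.com/api/records.xml';
$fp  = fopen('/tmp/records.xml', 'w');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // write the response body into $fp
curl_exec($ch);
curl_close($ch);
fclose($fp);

// The file can then be parsed from disk, e.g. with XMLReader,
// which reads node-by-node instead of loading the whole tree.
```

CURLOPT_FILE avoids CURLOPT_RETURNTRANSFER's in-memory string entirely, which is the point when the response is tens of thousands of records.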
  • devilwood
  • Silver Member
  • User avatar
  • Posts: 436

Post 3+ Months Ago

I'm using the Quickbase API for PHP. It uses cURL to send the HTTP request with the proper headers, the authorization token, and the query. I actually think the problem is on their end: their system sees that the query is returning too much data and just cancels it, so I never get a chance to download the file. Their support said they don't really know what the data limit is, that 30k records is probably too much, and that I could try splitting the data up by date. However, I was able to split it into 10k chunks using an offset parameter their API provides. So I fetch the XML with 10k records, parse it, then unset() my results variable (which holds their XML response) and empty all my other variables. It worked, so I figure I'm not going over my php.ini memory setting. I'd rather tweak my script and do everything I can there for memory management than change php.ini, because I have a lot of cron jobs (unattended scripts) and I'd hate for something to go wrong and cause one not to terminate. So I didn't want to set all the limits in php.ini to unlimited.
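The offset-and-unset() pattern described above can be sketched like this; fetchBatch() and its URL parameters are hypothetical stand-ins for the actual Quickbase API call, not its real interface:

```php
<?php
// Sketch: page through results with an offset, freeing memory each pass.
// fetchBatch() and its parameter names are illustrative placeholders.
function fetchBatch(int $offset, int $limit): string {
    $url = 'https://example.com/api/query?offset=' . $offset . '&limit=' . $limit;
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}

$offset = 0;
$limit  = 10000;

do {
    $results = fetchBatch($offset, $limit);
    $xml     = simplexml_load_string($results);
    $count   = count($xml->record);

    foreach ($xml->record as $record) {
        // ...process each record...
    }

    unset($results, $xml);   // drop the big variables before the next batch
    $offset += $limit;
} while ($count === $limit); // a short batch means the last page was reached
```

Because each batch is freed before the next request, the script's footprint stays roughly one batch in size no matter how many records accumulate, which is why it stays under the php.ini memory limit.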

Post Information

  • Total Posts in this topic: 3 posts

© 1998-2014. Ozzu® is a registered trademark of Unmelted, LLC.