Limit maximum length of PHP array

  • wpas
  • Proficient
  • Posts: 378
  • Loc: Canada

Post 3+ Months Ago

I build an output variable called $output, which holds a number of text lines separated by line breaks.
These lines are generated on a webpage, whose contents (the text lines with line breaks) I then fetch with cURL.

I then create an array as follows:

$arrayoutput = explode("\n", $output);

This works quite well until $output becomes so large that I get the following error:

Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)
(134217728 bytes is the 128M memory limit.)

$output is not always that big, but on occasion it can be.

Now, I do not want to increase the memory limit or anything like that.
I would like to be able to limit the maximum length of the array.

Let's say the array is created with at most 100,000 lines in it, or whatever keeps it within the memory limit.
When it reaches 100,000 elements, it stops.

Is it possible to do this?
  • Bogey
  • Genius
  • Posts: 8489
  • Loc: USA

Post 3+ Months Ago

I haven't done that and don't have much time to mess with this at the moment (almost midnight here and I'm going to sleep) but a quick search yielded SplFixedArray. Maybe that will help you out at least to get on the right track.
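For reference, a minimal sketch of what the SplFixedArray suggestion could look like. The size cap is enforced by construction, since SplFixedArray allocates a fixed number of slots up front; the strtok() tokenizing and the small sample string here are just illustrative, not from the thread (note that strtok() skips empty lines, unlike explode()):

```php
<?php
// Cap the number of lines by allocating a fixed-size array up front.
$maxLines = 100000;
$fixed = new SplFixedArray($maxLines);

$output = "line1\nline2\nline3";   // stand-in for the real cURL output

$i = 0;
$tok = strtok($output, "\n");       // pull one line at a time
while ($tok !== false && $i < $maxLines) {
    $fixed[$i++] = $tok;
    $tok = strtok("\n");
}
$fixed->setSize($i);                // shrink to the lines actually stored
```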
  • spork
  • Brewmaster
  • Silver Member
  • Posts: 6302
  • Loc: Seattle, WA

Post 3+ Months Ago

Don't use explode(); instead, write a function that builds the array by iteratively regex-matching one line at a time and appending it, stopping once you've reached the maximum size.
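A minimal sketch of that approach. The function name and the regex are my own choices, not from the post; preg_match() with an offset stands in for "regex-matching a line":

```php
<?php
// Build the array one regex match at a time, stopping at $max elements,
// instead of exploding the whole string at once.
function linesLimited(string $output, int $max): array
{
    $lines  = [];
    $offset = 0;
    while (count($lines) < $max
           && preg_match('/[^\n]*/', $output, $m, 0, $offset)) {
        $lines[]  = $m[0];
        $offset  += strlen($m[0]) + 1;   // skip the matched line and its "\n"
        if ($offset > strlen($output)) {
            break;                        // ran past the final line
        }
    }
    return $lines;
}
```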
  • Zealous
  • Guru
  • Posts: 1305
  • Loc: Sydney

Post 3+ Months Ago

what kind of reports are you trying to build for that size lol
  • devilwood
  • Silver Member
  • Posts: 447

Post 3+ Months Ago

It looks like you are parsing some type of delimited file in which each row is delimited by the line break "\n".

So you load each row into an array, and it makes sense that something with hundreds of thousands of rows is crapping out your memory. There are a few things you can look into.

1. Some PHP parsing functions will literally go line by line, so you can just limit the loop. This would not be explode(), but something like fgets(), or if you have PHP 5+ you should have stream_get_line(), which may work nicely for you.

2. Why do you need to store it in an array? Figure out exactly what kind of post-processing you want to do once you have your array; there may be a better way of storing that information, such as writing each row into a DB table rather than keeping it in an array.

3. Reconfigure your webserver or pick a new one. I did a ton of text-based processing for a supply chain back in 2008 and ran into memory errors all the time. I ended up swapping from Apache to Lighttpd, and for my project I never hit a memory error again. My server hardware was just a workstation converted to a webserver, so I didn't have the best hardware, and Lighttpd seemed to work much better.
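A sketch of option 1: stream the source line by line with stream_get_line() and cap the loop. The php://temp stream and the sample rows are stand-ins for the real source; with cURL you could point the response at a stream via CURLOPT_FILE and read from that the same way:

```php
<?php
// Read lines from a stream one at a time, never holding the whole
// response in memory, and stop once the cap is reached.
$maxLines = 100000;
$lines    = [];

$fh = fopen('php://temp', 'r+');       // stand-in for the real source stream
fwrite($fh, "row1\nrow2\nrow3\n");
rewind($fh);

while (count($lines) < $maxLines
       && ($line = stream_get_line($fh, 8192, "\n")) !== false) {
    $lines[] = $line;                   // the "\n" delimiter is not included
}
fclose($fh);
```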
  • bruce-mesnekoff
  • Born
  • Posts: 3
  • Loc: Wesley Chapel, FL 33544, USA

Post 3+ Months Ago

There is no maximum length for an array itself. There is a limit on the amount of memory your script can use.
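A small illustration of that point: the ceiling is the script's memory limit, which you can inspect at runtime with standard PHP calls:

```php
<?php
// The constraint is memory, not array length: inspect both at runtime.
$limit = ini_get('memory_limit');     // e.g. "128M" on the OP's setup
$used  = memory_get_usage(true);      // bytes currently allocated to PHP
```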

Thanks
Bruce Mesnekoff

Post Information

  • Total Posts in this topic: 6 posts
© 1998-2017. Ozzu® is a registered trademark of Unmelted, LLC.