First, want to say thanks again for this feature. It is awesome!!!
My question, though, is: is there a limit to the size or number of lines? I get an "Out of Memory for Strings" error at 2,286 lines and 80,577 characters. I am iterating through a text file line by line, using @LF as the delimiter; I did a replace on @CRLF first.
I tried just repeating the word "hello" instead of using long lines of text, and I could get to 5,457 lines and 38,204 characters. I got the same error if I added another line.
I do know I could use FileRead, but it would be good to know whether there is a limit or a bug.
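For context, this is roughly the pattern I am using (a minimal sketch -- the file path and loop body are placeholders, and I may not have the ForEach syntax exactly right):

```
; Read the whole file and normalize @CRLF line endings to @LF.
; "C:\temp\data.txt" is just a placeholder path.
str = FileGet("C:\temp\data.txt")
str = StrReplace(str, @CRLF, @LF)

; Walk the string line by line with the new delimiter feature.
ForEach line In str By @LF
   ; ... process line here ...
Next
```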
Thanks.
Jim
I don't know about this case, but I will say that there is something strange with this feature when it is used in large programs with big arrays, maps, and such. So far, my response has been to replace the ForEach-with-delimiter loop with an old ItemExtract loop, and the problem goes away.
It has resisted my attempts to create a "smoking gun" sample: when I try to create a simple case, it always works perfectly, which just means my test case was too simple. I haven't given up; I just haven't got it yet.
Thing is, you can't conclude much from this report: until I find a smoking gun, the most likely explanation is simply an unexpected delimiter in the string.
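To make the workaround concrete, here is a sketch of the ItemExtract-style replacement (assuming the delimited text is in a variable named str; the loop body is a placeholder):

```
; Old-style iteration: count the items, then pull each one out by index.
cnt = ItemCount(str, @LF)
For i = 1 To cnt
   line = ItemExtract(i, str, @LF)
   ; ... same loop body as before ...
Next
```

Slower on very long lists, since ItemExtract rescans the string each pass, but it sidesteps whatever the ForEach issue is.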
That is why I tried the lines of "Hello" as I thought the same thing. Appreciate the reply.
Quote from: kdmoyers on Today at 07:34:29 AM
I don't know about this case, but I will say that there is something strange with this feature when used in large programs with big arrays and maps and stuff. So far, my response has been to replace the Foreach-with-delimiter loop with an old ItemExtract loop, and the problem goes away.
It has resisted my attempts to create a "smoking gun" sample -- when I try to create a simple case, it always works perfectly. Which just means my case was too simple. I haven't given up, I just haven't got it yet.
Thing is, you can't conclude much from this report: until I find a smoking gun, the most likely explanation is simply an unexpected delimiter in the string.
I agree with you, though -- I dearly love this ForEach delimiter feature, so I will continue to lean on it whenever time allows.