Brian
I have been working on a data reception system. I am still finding my
way around JavaScript, though I am accomplishing much.
I just fixed a flaw that was really hard to find. The symptoms are
these:
I get a multiline string returned to JavaScript from a proxy + Google
Maps API GDownloadUrl().
The data, when added to a DOM table, looked fine: about 20 lines in CSV
format:
Sunrise,-119.098,35.345,0.0<br>
SwanDancer,-119.345,35.567,1.0<br>
.... etc
(I don't know why the <br>s are there, but that's what it looks like.)
So, using a suggestion from this newsgroup, I perform two successive
split()s:
var strCSVFile = data;
var arrayCSVFile = strCSVFile.split( "<br>" );

for ( var index = 0; index < arrayCSVFile.length; index++ )
{
    arrayCSVFile[ index ] = arrayCSVFile[ index ].split( ',' );
    // do stuff to the elements
}
I use both strCSVFile *and* arrayCSVFile to be doubly sure I wasn't
somehow clobbering something, though in theory only the original string
is needed. At any rate, what I see is this (after HOURS of trying,
finally resorting to str.charCodeAt()):
10|32|32|32|32|32|32|32|32|83|117|110|114|105|115|101| len=16
10|10|32|32|32|32|32|32|32|32|83|119|97|110|68|97|110|99|101|114| len=20
.... etc
%^!@#$^%@ <- that's cursing, people
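In case it helps anyone, the dump format above can be reproduced with a tiny helper along these lines (the function name is mine):

```javascript
// Walk a string and emit each char code separated by '|', plus the
// length. Invisible chars like LF (10) and SPACE (32) become obvious.
function dumpCodes( s )
{
    var out = '';
    for ( var i = 0; i < s.length; i++ )
    {
        out += s.charCodeAt( i ) + '|';
    }
    return out + ' len=' + s.length;
}

// dumpCodes("\n        Sunrise") produces the first line of the dump:
// "10|32|32|32|32|32|32|32|32|83|117|110|114|105|115|101| len=16"
```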
So I am now hand-clipping some number of LF and SPACE chars using
str.charCodeAt(). On top of that, my futile attempts at RegExp
replacements along the way had been SILENTLY FAILING, probably because
of the leading LF(s). I had no idea, and it cost valuable time.
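For anyone in the same spot: instead of hand-counting LFs and SPACEs with charCodeAt(), a single regex replace clips both ends. A sketch (older engines lack String.prototype.trim, hence the helper, whose name is mine):

```javascript
// Strip leading and trailing whitespace (LF, CR, space, tab).
// Does what String.prototype.trim does in newer engines.
function trimWS( s )
{
    return s.replace( /^\s+|\s+$/g, '' );
}

// trimWS("\n\n        SwanDancer") -> "SwanDancer"
```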
I looked for split() gotchas but never found anything like this. I
thought I tried changing the split to "<BR>\r" at one point, but I
probably used the carriage return instead of the line feed... Also, that
would NOT have handled the first-line case?!
This is what is happening, and I now have tedious code to handle it.
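For what it's worth, the tedious clipping can probably be folded into the parse itself by trimming each piece after the <br> split. A sketch, assuming data shaped like my dump (the function name is mine):

```javascript
// Split on <br>, trim the invisible LF/SPACE padding off each piece,
// drop empty pieces (e.g. after the trailing <br>), then split on commas.
function parseCSV( data )
{
    var pieces = data.split( /<br>/i );  // case-insensitive, catches <BR> too
    var rows = [];
    for ( var i = 0; i < pieces.length; i++ )
    {
        var row = pieces[ i ].replace( /^\s+|\s+$/g, '' );
        if ( row.length === 0 ) continue;
        rows.push( row.split( ',' ) );
    }
    return rows;
}

// parseCSV("\n   Sunrise,-119.098,35.345,0.0<br>\n\n   SwanDancer,-119.345,35.567,1.0<br>")
// -> [["Sunrise","-119.098","35.345","0.0"],
//     ["SwanDancer","-119.345","35.567","1.0"]]
```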
Looking back at the original received data, it does indeed have leading
LF and SPACE chars, with two LFs on every subsequent row. I never saw
them. How could I? When I added the data to the HTML page to check it,
they didn't show, since browsers collapse runs of whitespace.
This was awful.
FYI