ameerov
I'm writing a web spider. When a bad URL is encountered, a
MalformedURLException is thrown. What I want to do is ignore the bad
URL's exception and continue reading the other URLs.
I'm not very familiar with try/catch blocks, but as I understand it,
there is no way to go from the catch block back into the try block
after an exception has been encountered.
So how is it possible to ignore this exception, skip reading the bad
URL, and continue reading the good ones?
Thanks.
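The usual answer is to put the try/catch *inside* the loop that iterates over the URLs rather than around the whole loop. The catch block then only ends the current iteration, and the loop moves on to the next URL. A minimal sketch of that pattern (the class name, method, and URL list here are invented for illustration, not taken from your spider):

```java
import java.net.MalformedURLException;
import java.net.URL;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class Spider {
    // Parse each candidate string; silently skip the ones that throw
    // MalformedURLException so the loop continues with the next URL.
    static List<URL> parseAll(List<String> candidates) {
        List<URL> good = new ArrayList<>();
        for (String s : candidates) {
            try {
                good.add(new URL(s)); // may throw MalformedURLException
            } catch (MalformedURLException e) {
                // Bad URL: report it and move on. Because the try/catch is
                // inside the loop, execution continues with the next item.
                System.err.println("Skipping bad URL: " + s);
            }
        }
        return good;
    }

    public static void main(String[] args) {
        List<String> urls = Arrays.asList(
                "http://example.com/",
                "not a url",              // malformed: no protocol
                "http://example.org/page");
        List<URL> good = parseAll(urls);
        System.out.println(good.size() + " good URLs out of " + urls.size());
    }
}
```

If instead the try/catch wrapped the entire loop, one bad URL would abort the whole loop, which is exactly the behavior you want to avoid.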