How do I clone a website? Via HTTrack, F12, or wget [closed]

I tried to make a copy of a site with HTTrack, but I can't get anywhere: the output is full of code I don't need.

I also tried the Save All Resources extension, but it makes me save every page of the website one by one, and then I don't know how to join all the pages back together into a site I can edit.

What I want is to copy an entire site without all the extra code, and then edit it my way.

I was also told about wget. I already put the .exe file in the System32 folder, but I don't know how to use it to copy the website, or where the downloaded files end up.

In your opinion, what is the best way to copy a website?

Thank you
 
Yep, wget's the tool of choice on this one. Try something like the following.

Code:
wget -r -p -e robots=off --level=<how deep to go> '<your URL here>'

-r tells wget to follow links and download pages recursively
-p tells it to "download all the files that are necessary to properly display a given HTML page" (images, stylesheets, and so on)
--level tells it how many links deep to follow (with -r the default is 5). If you're doing this on something that cross-links a lot, like Wikipedia, you'll end up many, many layers deep, and that may not be desirable.
-e robots=off tells it to ignore the robots.txt file, which is generally considered rude, but that's mostly for indexers and such that periodically scan sites; it may or may not be required, depending on your target.
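
Since the goal is to edit the copy afterwards, a few extra flags help. Here's a rough sketch for the Windows Command Prompt, with example.com and C:\sites\mycopy standing in for your own URL and output folder: --convert-links rewrites the links in the saved pages so they point at your local files, --adjust-extension makes sure pages get an .html extension, --no-parent stops wget from climbing above the starting URL, and -P sets where everything lands.

Code:
rem Saves the site under C:\sites\mycopy\example.com\
wget -r -p --level=2 -e robots=off ^
     --convert-links --adjust-extension --no-parent ^
     -P C:\sites\mycopy https://example.com/

When it's done, open C:\sites\mycopy\example.com\index.html in a browser and edit the files from there. Bear in mind this only captures what the server actually sends; anything a page builds with JavaScript at runtime won't come across in the copy.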
 
