Why is wget slow?

I've read wget's help output, and it seems there are no options that would make it any faster. Is there any way to speed wget up? Or is it possible to make it download with multiple threads?

Unlike many other download managers, Axel downloads all the data directly to the destination file, using one single thread.

This saves some time at the end because the program doesn't have to concatenate all the downloaded parts.

I tried axel on Gufran's recommendation, but it hugely disappointed me. I also wanted a multithreaded replacement for wget and curl, not some kludge of a script that runs multiple instances of them. So I searched further and found what I currently consider the most modern multithreaded CLI downloader there is: aria2.
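The thread doesn't show an actual axel invocation; as a sketch, a typical multi-connection download looks like this (the URL and output filename are placeholders):

    # Download with 8 simultaneous connections; axel writes all parts
    # directly into the destination file, so no final concatenation step.
    axel -n 8 -o archlinux.iso https://example.com/archlinux.iso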

The big problem I had with axel was that it 'faked' downloading files over SSL. I caught it doing that with tcpdump: it was downloading https links as ordinary http.

That really pissed me off, and if I hadn't checked, I would have had a false sense of security. I doubt that many people know about this serious security hole. Getting back to aria2: it is more advanced than any other downloader. The man page is gigantic, and I will never use more than a few of its many options.
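For reference, a basic multi-connection aria2 download looks something like the following (the URL is a placeholder; -x caps the connections per server and -s sets how many pieces the file is split into):

    # Fetch one file over up to 16 connections, split into 16 pieces
    aria2c -x 16 -s 16 https://example.com/archlinux.iso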

I wrote a script that mimics DTA's (DownThemAll's) behavior, if not its convenience. It creates the target directories and downloads into them, including nested directories. A sketch of the idea follows.
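The original script isn't reproduced in the thread; a minimal sketch under the same idea, assuming aria2c is installed and using a hypothetical fetch.sh that takes the target directory and URL as arguments, might look like this:

    #!/bin/sh
    # Usage: fetch.sh <dir> <url>
    # Creates <dir> (nested paths allowed) and downloads <url> into it.
    dir=$1
    url=$2
    mkdir -p "$dir" || exit 1
    # -d sets aria2's download directory; -x/-s enable multiple connections
    aria2c -d "$dir" -x 8 -s 8 "$url"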

I, for example, see quite a variation, from normal speeds down to single digits.

In reply to GaKu: I don't know why you would comment something which has already been conveyed earlier in this thread.

Your comment has no relevance whatsoever to my issue. Please read the issue description once again, and please use code tags when posting commands or output.

Did you ensure there's no side traffic, notably BitTorrent? Can you try the wired interface? Does it yield different results? One way to check for side traffic is sketched below.
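As an illustration (not from the thread), you can list which processes currently hold network connections with ss, or watch per-process bandwidth with a tool such as nethogs if it is installed; eth0 stands in for your actual interface:

    # Show TCP/UDP sockets with the owning processes (root shows all)
    ss -tunap
    # Live per-process bandwidth on a given interface
    sudo nethogs eth0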

Yes, I ensured that there are no other applications running on my machine that could consume a large amount of data: no qBittorrent, browser downloads, or anything like that. As you suggested, I also tried an Ethernet cable. Sadly, the result hasn't changed at all; it's the same.

The server you used to test with wget is painfully unresponsive. Post the command line you used with reflector to create your mirrorlist, or the contents of your reflector configuration.

And also the mirrorlist itself.

As it has already been confirmed within this thread that the mirrorlist isn't at fault, I don't understand why you would want to see my reflector settings. And the server that I mentioned won't respond if you change the value at the end of the link, e.g. changing MB to something like 10MB. Moreover, this was just to show that the speed with wget is the same as that of pacman or git clone. I'll happily provide any output whatsoever once you can actually establish the reason for asking.

Yeah, I tried to do that. Through the browser it was around 1. So the problem still persists.

But that's a far cry from the previously claimed disparity, and Hetzner is MUCH faster. We can't say "everything normal in the browser but wget is bad", btw.

Do you reside in the cheese and chocolate exporting nation suggested by your email address? Do you have a second system so you can test the LAN speed?

Yes, I do have a second system, but it is out of order at the moment, so I can't test my speed on that. I tried again today through the browser; it was 2.
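For anyone wanting to run that LAN test, the usual approach (not spelled out in the thread) is iperf3 between the two machines; the address below is a placeholder for the server's LAN IP:

    # On the second system (the server side):
    iperf3 -s
    # On the machine under test, pointing at the server's LAN address:
    iperf3 -c 192.168.1.10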

Either way, I don't think this issue is anywhere near being resolved. I'm in India at the moment.

If the above don't solve your issue, you might also try adding the destination IP and hostname to the end of your hosts file to bypass DNS resolution (see the sketch after this paragraph). See also: Why is my web site so slow? In brief, an HTTPS connection needs several extra round trips (the TCP handshake plus the TLS handshake) before the content transfer can even start, and then there is CPU overhead in encrypting the communication.
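A hypothetical /etc/hosts entry of that kind would look like this (the IP and hostname are placeholders; use the actual address the site resolves to):

    # /etc/hosts -- skip the DNS lookup for this host
    203.0.113.7    downloads.example.com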

A large portion of the time spent waiting for a web request can be waiting for other things to happen, mainly DNS resolution, connection setup (including TLS), and the server producing its response. Let's say your IPv6 configuration is not working properly. You make a request with wget for some site, say www.example.com. Your resolver picks a nameserver, round robin, from your resolv.conf, and the lookup returns IPv6 (AAAA) addresses as well as IPv4 ones. wget tries an IPv6 address first; your IPv6 configuration is not working, so we wait and wait and time out. Then we pick another IPv6 address and wait again.
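Two hedged examples of dealing with this (the URL is a placeholder): curl's -w option can break down where the time goes, and wget can be told to skip IPv6 entirely:

    # Where does the time go? DNS, TCP connect, TLS, first byte, total:
    curl -o /dev/null -s -w 'dns %{time_namelookup}  tcp %{time_connect}  tls %{time_appconnect}  ttfb %{time_starttransfer}  total %{time_total}\n' https://example.com/file.iso

    # If broken IPv6 is the culprit, force wget to use IPv4 only:
    wget --inet4-only https://example.com/file.iso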
