Use wget to download from hotfile – automated!
To automatically download from Hotfile using Wget you can do the following:
wget --save-cookies /path/to/hotfilecookie --post-data "returnto=%2F&user=1234567&pass=yourpass&=Login" -O - http://www.hotfile.com/login.php > /dev/null
This will save your login to a cookie on your server and then we can do:
wget -c --load-cookies /path/to/hotfilecookie -i /path/to/inputfile -o /path/to/downloadlog -nc -b
This will set wget running in the background logging its progress to a download log. It will skip downloading any files that already exist, and it will read all the files to download from a file called inputfile. I hope this helps someone 🙂
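The two commands above can be wrapped in a small shell script. This is only a sketch: the cookie path, input-file path, account number (1234567) and password are the placeholders from the post and must be replaced with your own.

```shell
#!/bin/sh
# Sketch of the two-step workflow from the post above.
# All paths and credentials are placeholders.
COOKIE=/path/to/hotfilecookie
INPUT=/path/to/inputfile
LOG=/path/to/downloadlog

# Step 1: log in once and save the session cookie to $COOKIE.
hotfile_login() {
    wget --save-cookies "$COOKIE" \
         --post-data "returnto=%2F&user=1234567&pass=yourpass&=Login" \
         -O - http://www.hotfile.com/login.php > /dev/null
}

# Step 2: fetch every URL listed in $INPUT using the saved cookie,
# resuming partial downloads (-c), skipping files that already
# exist (-nc), running in the background (-b), logging to $LOG (-o).
hotfile_fetch() {
    wget -c -nc -b --load-cookies "$COOKIE" -i "$INPUT" -o "$LOG"
}
```

Call hotfile_login once, then hotfile_fetch whenever you add new links to the input file.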
Categories: wget
automated download, hotfile, wget
Thank you – it was most helpful, as I was unable to figure out the proper ‘post data’ string to send to hotfile.com while trying to record the cookie. Now I have to figure out how to get hotfile.com to send the file and not a manual ‘click this link’ to get the download. It’s a piece of cake with Rapidshare since they use ‘temporary redirection’. Life with computers, it’s never boring.
hi! the second command downloads the files that are listed in “inputfile” and it works ok, or did you just want to get a single file?
didn’t work for me, it just downloaded the html page.
does the html page say you need to log in? if so, it could be a problem with authentication. check your cookie path and see what’s inside, if it exists at all.
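If you suspect the cookie is the problem, a quick check like this can rule out an empty or missing cookie file. The path is a placeholder, and matching on "hotfile.com" is an assumption about what wget wrote into the cookie jar:

```shell
# Returns success only if the cookie file exists, is non-empty,
# and mentions the hotfile.com domain somewhere inside it.
check_cookie() {
    [ -s "$1" ] && grep -q "hotfile.com" "$1"
}

# Example: check_cookie /path/to/hotfilecookie && echo "cookie looks ok"
```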
I just had to enable the direct download option, my bad
Excellent tip!
Works like a charm! Low resource use and fast downloads. Now I wonder if we could use aria2c to start multiple downloads, since by the look of the commands they are quite similar to wget’s. It supports multi-threaded downloads too!
Any pointers anyone?
Very good! Confirmed Aria2 to be working the same as wget, except …
It does:
1. Multi-threaded downloads.
2. Multiple simultaneous downloads.
3. Supports other protocols as well, like torrents, http, ftp, metalink, magnet links, etc. But then we are only concerned about hotfile premium here, aren’t we?
4. Download speeds have been boosted/optimized compared to the wget method (close to 90-95% of full internet speed).
Requirements:
a. To install the aria2 package (on Debian/Ubuntu; use your distro’s package manager otherwise):
sudo apt-get install aria2
Usage :
a. This will save your login to a cookie on your server:
wget --save-cookies /path/to/hotfilecookie --post-data "returnto=%2F&user=1234567&pass=yourpass&=Login" -O - http://www.hotfile.com/login.php > /dev/null
b. This will start the download from the input file list and log progress to the specified file. Note that aria2c uses -l for the log file and -D to run in the background; wget’s -o, -nc and -b flags don’t exist in aria2c:
aria2c -c --load-cookies /path/to/hotfilecookie -i /path/to/inputfile -l /path/to/downloadlog -D
NOTE: I suggest trying out step b. first without log file or background execution so you can actually see and understand what is happening.
IMHO: Screen is better than running the process in the background because you can switch to it back and forth as desired. However you will then need to install screen.
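For the screen approach mentioned above, a small wrapper like this works; the session name "hotfile" is arbitrary, and the aria2c arguments are just the ones from step b:

```shell
# Run any long command in a detached screen session that survives
# logout; reattach with `screen -r hotfile`, detach with Ctrl-a d.
start_in_screen() {
    screen -dmS hotfile "$@"
}

# Example (paths are placeholders):
#   start_in_screen aria2c -c --load-cookies /path/to/hotfilecookie -i /path/to/inputfile
```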
Have fun & thanks to the author for the “spark” post!
i’ve tried both of these but only get 22kb files saved. using ubuntu 8.04 server. have a prem hotfile acc. made sure direct downloads was on. any suggestions?
open the files and I bet it will be an error like the file has been removed or that you need to log in! take a look
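A tiny "download" like that is almost always an HTML error page. A quick way to tell without opening each file by hand (the filename in the example is a placeholder):

```shell
# Returns success if the start of the file looks like an HTML page
# rather than real download data.
is_html_page() {
    head -c 512 "$1" | grep -qi "<html"
}

# Example: is_html_page part1.rar && echo "got an error page, not the file"
```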
i want to use this but it said “the system cannot find the file specified”
i’m noob
help me plz
It’s possible you don’t have wget installed on your system. Try typing just wget alone and see what you get. You can also try typing locate wget and running it directly.
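To check whether wget is installed at all, something like this does the job:

```shell
# Echo where wget lives, or a hint if it is missing entirely.
check_wget() {
    if command -v wget >/dev/null 2>&1; then
        echo "wget found at: $(command -v wget)"
    else
        echo "wget not installed; install it with your package manager"
    fi
}
```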
thank you for sharing your knowledge to help us out here… my situation is that lately I can rarely get a complete download from the Hotfile free edition.. downloads stall at 99.99%, which is no good for rar files…
I have WackGet installed and can see wget.exe (196 KB) in the folder; I’ve also downloaded another build at 317 KB, but neither will open. There’s just a quick command-prompt screen flash, and that’s it.
I assume a command-prompt window has to open to paste your commands into? Can you, will you help with this scenario?
Can you download the file successfully in a regular browser? You mention Hotfile free.. I am sure they have anti-leeching scripts which are designed to stop this unless you are a paid member. I am sure that’s it!
I’ve had problems with this system since early July.. The cookie works, so I’m logged in, but Hotfile doesn’t see the account as a premium user and therefore shows the “choose between regular and premium speed” page..
A short snippet:
”
Benutzer:
1788113
”
AND
”
Account:
Kostenlos
”
Benutzer = User, and Kostenlos = Free (don’t know why it gives me the German-language site..)
But, when I use the browser to login, I am a premium user..
Thank you so much for the reply.. forgive my late response… I’m using Chrome and Firefox, JDownloader, WackGet, FreeRapid Downloader.. nothing gets past 99%… Hotfile free used to work fine until about 2 weeks ago, so… I don’t know. I guess you’re saying that your scripts won’t work either?
Yes.. My script doesn’t work..
It downloads the .html page that contains the “choose between regular and premium speed” (the page you’ll get if you’re not a premium user)..
I think Hotfile has changed something in their code – maybe it only checks at the very end whether the user is premium, and the page therefore needs a refresh before it knows if the user is a premium user..
It’s a bit weird.. But that’s my best guess
If the .html files show that you are logged in, try this:
go to “Edit account” and make sure that you check “direct download”.
It’s working!
I’ve just looked into the settings – it was already checked..
I will try a bit myself to find a solution..
Okay – very strange.. I’ve run the script on my home server (instead of my web domain), and the script works just fine..
I’ve looked through all the code multiple times – the code is exactly the same..
Can Hotfile maybe have blacklisted my domain?
I’ve tried the wget code directly in the terminal.. Still not working on the domain but just fine on the home server.. I’ve updated the cookie on both servers.. The wget version is the same on both servers..
Can’t you feed aria2c directly with your login details?
aria2c --http-user=xxxxx --http-passwd=xxxxxx -i file details
This is how it works for RS accounts
I have the same problem as our mwf friend. On my home server and Dedicated #1 all is fine; on Dedicated #2 it shows a free account, though it’s premium. Any suggestions?
Something is fucked up in Linux (I use a Hetzner dedicated server with Ubuntu 8.10). I gave up on wget and tried the elinks browser.
elinks http://hotfile.com
I logged in, and exited with Ctrl+C. Then
elinks http://link-to-file
opens elinks WITHOUT being logged in!
The same operation on my Mac OS-X works fine. So the problem is not with wget, but with Linux, or at least my dedicated server.