Open
Bug 40106
Opened 25 years ago
Updated 2 years ago
Accelerated Download for files with mirrors (swarming)
Categories
(Firefox :: File Handling, enhancement)
Tracking
NEW
People
(Reporter: netdragon, Unassigned)
References
Details
(Keywords: helpwanted, Whiteboard: multiple _different_ sites, not the same one multiple times. (See comment #26))
This is basically an extension of bug 22796 to add my ideas, but I believe it is different enough to file as a separate bug.
Basically, a download accelerator somehow gets all the mirrors of the server you are downloading a file from and switches between servers as the file downloads, depending on the ping (I think). That way you get a superfast download (I once got 800K on a 10 Mb/s total LAN). I think Mozilla should ship with a default download accelerator, which could of course be replaced if someone downloads a replacement from another company.
Also, you could show download status in the window along with a clickable advertisement that changes every couple of minutes. That way, you can make money while people download - YEAH!!! - whaddayathink? Am I evil or what? :)
That could be built on top of bug 22796 (download before choosing file).
Updated•25 years ago
Status: UNCONFIRMED → NEW
Component: Browser-General → Networking
Ever confirmed: true
Target Milestone: --- → Future
Comment 1•25 years ago
--> networking, marking future, confirming bug.
Reporter
Comment 4•24 years ago
I would like to say that I think the download accelerator feature should be a plugin. Mozilla would ship its default acceleration plugin, or people could download one from a third party. When you choose to save a file, the Save As dialog box would come up. If acceleration is enabled, all the mirrors and their pings would appear at the bottom of that box. When you see some good pings (or all mirrors have been found, or none), you can click "Start Download".
Reporter
Comment 5•24 years ago
I think this would be best implemented as a default download plugin.
Summary: [RFE]Accelerated Download Include For Netscape → [RFE][Money]Accelerated Download Include For Netscape
Reporter
Comment 6•24 years ago
OK, I have thought about this, and in my opinion the only reason for implementing this would be that there is no OS-independent download accelerator that I can think of.
Reporter
Comment 8•23 years ago
BTW - I don't feel this needs to compete with Download Accelerator or any other extremely high-quality download accelerator; it just needs to be a little better than a normal download. Maybe include the most basic features of Download Accelerator, such as resume and mirror finding/switching, to replace the old download code.
Comment 9•22 years ago
*** Bug 159529 has been marked as a duplicate of this bug. ***
Comment 10•22 years ago
Probably the best place for this bug...
Assignee: nobody → law
Component: Networking → File Handling
QA Contact: benc → sairuh
Comment 11•22 years ago
*** Bug 144325 has been marked as a duplicate of this bug. ***
Comment 12•22 years ago
http://www.filemirrors.com/ - source of mirrors (from bug 144325).
Comment 13•22 years ago
Awesome bug! :D
Change "Component" to *Download Manager*?
Suggest changing the summary to avoid more dupes:
"Support FileMirrors in the Download Manager (Accelerated Downloads)"
Some feature suggestions:
1. Ping all servers in the mirror list on start, then auto-select the lowest-ping server.
2. Right-click: "recalculate speed".
3. Ability to select a _particular_ server.
4. Pref to allow uploading of file servers to filemirrors.com (or other mirror lists).
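Suggestion 1 above (ping all mirrors, auto-select the lowest) could be sketched roughly as follows. This is only an illustration of the idea, not anything from the bug itself: the HEAD-request timing approach and function names are assumptions.

```python
import time
import urllib.request

def time_mirrors(mirror_urls, timeout=3.0):
    """Time a HEAD request to each mirror URL; unreachable mirrors are skipped."""
    timings = {}
    for url in mirror_urls:
        req = urllib.request.Request(url, method="HEAD")
        start = time.monotonic()
        try:
            with urllib.request.urlopen(req, timeout=timeout):
                timings[url] = time.monotonic() - start
        except OSError:
            continue  # mirror down or unreachable: leave it out of the candidates
    return timings

def pick_fastest(timings):
    """Auto-select the mirror with the lowest measured latency."""
    if not timings:
        raise RuntimeError("no mirror responded")
    return min(timings, key=timings.get)
```

A real implementation would probe mirrors concurrently and re-measure periodically (suggestion 2's "recalculate speed"), but the selection logic stays this simple.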
Updated•22 years ago
QA Contact: sairuh → petersen
Summary: [RFE][Money]Accelerated Download Include For Netscape → [Money]Accelerated Download Include For Netscape
Comment 14•22 years ago
A download accelerator is a resource hog for content providers like me.
I don't have a T1, I have a 256k DSL link. If I put some average-sized binary files up for download, the least I expect is for someone to use Download Accelerator to "download my files faster", while what it is really doing is SATURATING MY FIXED-BANDWIDTH CONNECTION, making it impossible for others to get the file at the same time.
What download accelerators do, when there is no "mirror" and the file is on a single server, is OPEN SEVERAL SIMULTANEOUS CONNECTIONS to the same file, on the SAME SERVER, requesting different "byte ranges" from the HTTP server, effectively splitting a 10 MB file into ten requests for 1 MB byte ranges of the same file.
So I am effectively serving 10 simultaneous HTTP requests for a single user. Sounds fair to other users? It is NOT.
And I hate it.
I would hate to see any sort of "download accelerator" that hogs content providers' resources added to Mozilla. If a third party wants to develop one and some self-centered, careless individual wants to install and use it, there is nothing we can do to stop them, but adding it to Mozilla would send a BAD SIGNAL.
A COUPLE DOZEN SIMULTANEOUS HTTP CONNECTIONS from a single user, just to get "his" file faster, is NOT GOOD NETIZENSHIP.
Reporter
Comment 15•22 years ago
I agree - this is for third parties. It's important that we have a nice download manager, but this is overkill and might even be controversial.
Status: NEW → RESOLVED
Closed: 22 years ago
Resolution: --- → WONTFIX
Comment 17•22 years ago
Killing this bug mainly because of comment 14 may have been overkill. Mozilla could still use mirrors to find the fastest server(s) and download from multiple servers; but if only ONE SERVER is found/available, THEN allow only ONE CONNECTION.
Suggest/Request: REOPEN
Comment 18•22 years ago
Ok, I can't believe someone closed this bug as WONTFIX.
There is absolutely no point in adding a download manager to Mozilla without resumed downloads and file segmenting. I know GetRight does it; I used GetRight for years.
Mozilla broke GetRight - at least that is their stance. So I waited until it looked like Mozilla was going to support those features before I finally switched. Now it seems the feature is being dropped because of some lame comment from a web site admin. Multiple file segments are a MUST. They speed up downloads better than any other technology I have used.
Reporter
Comment 19•22 years ago
Kevin: For GetRight to say we broke their commercial software is pure and utter bs. http://www.getright.com/whatsnew.html says that they have plugins to work with Mozilla. There is also http://www.downloadaccelerator.com/
Peter: A WONTFIX resolution can change in the future. It is definitely not set in stone, but it is where we think this belongs at this time. If you can get the owner of this component to back adding this, it will be reopened.
It could also be hosted on Mozdev if some volunteer wants to take this up.
It could also be placed on Mozdev if some volunteer wants to take this up.
Reporter
Comment 20•22 years ago
*** Bug 173554 has been marked as a duplicate of this bug. ***
Reporter
Comment 21•22 years ago
Adding swarming to the topic because that is a nickname for this feature
request.
Summary: [Money]Accelerated Download Include For Netscape → [Money]Accelerated Download Include For Netscape (swarming)
Comment 22•21 years ago
*** Bug 237571 has been marked as a duplicate of this bug. ***
Comment 23•20 years ago
*** Bug 255053 has been marked as a duplicate of this bug. ***
Comment 24•20 years ago
*** Bug 75360 has been marked as a duplicate of this bug. ***
Comment 25•20 years ago
Reopening per bug 75360 comment #28 - it's already assigned to the "Future" target milestone, so it shouldn't do much harm...
Status: VERIFIED → REOPENED
Resolution: WONTFIX → ---
Comment 26•20 years ago
re comment #14 -- as an FTP server admin, I totally agree. If this feature is
implemented, it should _only_ attempt to access multiple _different_ sites, not
the same one multiple times.
Reporter
Comment 27•20 years ago
Updating the out-of-date summary, assigning to the module owner, and adding a portion of comment #26 to the status whiteboard. What this will do is actually spread out the load across multiple servers.
Taking into consideration what is said in comment #26, we'll otherwise probably be helping the web: when people go to download files, they often choose the first server in the list, so this will help spread out the bandwidth requirements for downloading a file.
Maybe connections should be momentarily dropped if they fall below a certain speed; the server is probably overloaded, and we could be keeping it from catching up (DoSing it unintentionally). Is this something we should do?
We can show a tree in the download dialog for this feature.
RE comment #18: do we support resume yet?
Assignee: law → file-handling
Status: REOPENED → NEW
QA Contact: chrispetersen → ian
Summary: [Money]Accelerated Download Include For Netscape (swarming) → Accelerated Download for files with mirrors (swarming)
Whiteboard: multiple _different_ sites, not the same one multiple times. (See comment #26)
Comment 28•20 years ago
The filemirrors.com bit sounds kind of awkward and problematic. How would you handle layout/other HTML changes? Do (or would) they offer search results in XML or some other constant format? If so, would they care to get little or no credit for providing the mirror list, as this would run behind the scenes? I can't imagine any organization being OK with that unless there's some kind of service fee, and I don't believe that would be a reasonable request of Mozilla.org.
The only other way I could see this done reliably would be by getting a list of addresses through an additional (?) gethostbyname() call. Obviously, anything behind an Alteon/Foundry/other would be excluded. Aside from rather large mirrors (debian, redhat), how many sites actually make use of a round-robin type configuration? I would guess the number to be rather low, and if that's the case, would it still be enough to warrant this feature?
As far as the general idea behind these download accelerators goes, I strongly agree with comment 14 and truly hope that abusive behavior never finds its way into Mozilla products.
Comment 29•20 years ago
Just wondering if this bug is even still a good one. This one started in 2000.
Were "bit torrents" even around then? Maybe this should be moved to a bit
torrent creation bug.
Reporter
Comment 30•20 years ago
This has nothing to do with bittorrent.
Comment 31•20 years ago
What I meant is that things have changed since 2000, and maybe bit torrent can
take the place of this function. Over.
Reporter
Comment 32•20 years ago
I see what you are saying. For example, Fedora uses BitTorrent for downloads, and we could integrate the ability to use torrents as a mirroring capability. I guess that would be a different bug.
Unfortunately, I don't know enough about how accelerated downloads work at the lowest level to know whether this is doable.
Comment 33•20 years ago
*** Bug 269330 has been marked as a duplicate of this bug. ***
Comment 34•20 years ago
*** Bug 269779 has been marked as a duplicate of this bug. ***
Comment 36•20 years ago
Will this bug be pursued or not? It is definitely worthwhile to have built-in mirroring capabilities. Of course, if there is only one mirror, several connections should not be opened to it (as they are in some download accelerators).
Also, would a feature that allows capping (in KB) of certain downloads be of interest to more than just me?
Comment 37•19 years ago
*** Bug 322049 has been marked as a duplicate of this bug. ***
Updated•19 years ago
Priority: P3 → --
Target Milestone: Future → ---
Comment 38•19 years ago
This bug may depend on bug 230870.
Comment 39•19 years ago
There are a number of possibilities here. Certainly with larger files, they can be downloaded in chunks from multiple servers. Assuming auto-resume works, this is not difficult to do.
More important is general browsing speed; if browsers compete, this is one of the critical issues. One way to accelerate downloads of websites is swarming. HTTP/1.1 supports Content-MD5, and MOST website objects are < 64 KB (the max size for a UDP packet).
A very simple addition to web fetching could be a sort of swarm handling of websites. A browser would have to request the MD5s of files (say, index.html) from the original server to begin with, but this isn't a big burden on the server.
If downloading is slow (error codes 408, 503, 504), the browser could reach for a known tracker-tracker (tracker-trackers are easy to solve, and could even be pure P2P, since a charitable or commercial MD5/IP lookup and registry service is trivial to build) and from that tracker gain a list of browsers that have registered this MD5 in their cache, sending a single small UDP request to any of these peers and receiving a single LARGE UDP packet with the requested file (zlib is already in Firefox).
UDP firewall problems have largely been solved these days.
Essentially this would eliminate the slashdot/digg effect. Also, since the MD5s have to be retrieved from the actual server, it does not compromise the site's ability to log traffic or rotate ads.
(Tracker-trackers could also benefit by being aggregators of web-traffic statistics, since this feature would be used on THE busiest websites.)
Comment 40•18 years ago
It's 2006 and, as far as I know, no Web browsers support segmented / accelerated / multi-threaded downloads. I believe I started using a download manager for this over a decade ago. It was handy on dialup, but on dsl/cable/fiber it's a must. If Firefox supported accelerated downloads, the feature would become more mainstream, and it would be a highly visible new feature as it would improve download speeds greatly over other browsers.
(To hawk my own solution:) I've developed Metalink (bug 331979), an XML file used by download managers that contains mirror information for segmented downloads along with checksums & other useful info (p2p, OS, language) - one link that has all the mirror & checksum info for a file (or files).
There is a screen capture of a Metalink being used with GetRight at http://www.metalinker.org/implementation.html#video
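As a rough illustration of the idea described above, a Metalink file bundles mirror URLs and checksums for one download. The file name, URLs, and hash value below are made up, and the exact element names should be checked against the Metalink specification:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<metalink version="3.0" xmlns="http://www.metalinker.org/">
  <files>
    <file name="example-1.0.tar.gz">
      <verification>
        <hash type="md5">0123456789abcdef0123456789abcdef</hash>
      </verification>
      <resources>
        <url type="http">http://mirror1.example.org/example-1.0.tar.gz</url>
        <url type="ftp">ftp://mirror2.example.org/example-1.0.tar.gz</url>
      </resources>
    </file>
  </files>
</metalink>
```

A client can fetch segments from several of the listed mirrors in parallel and verify the result against the checksum, which is the "one link, all mirror & checksum info" point the comment makes.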
Comment 41•18 years ago
I'd like to mention that Ant Bryan (comment #40) has a better approach than the mirror-finding ideas discussed above. It may also be a relief for website administrators, as Metalink gives them flexibility as well as a simple interface for the user.
This should be the way to go.
Comment 42•18 years ago
Re: comment 39
>a charitable or commercial md5/ip lookup and registry service
Re: comment 28
>The filemirrors.com bit sounds kind of awkward and problematic. How would you
>handle layout/other HTML changes? Do (or would) they offer search results in XML
>or some other constant format? If so, would they care to get little or no credit
>for providing the mirror list as this would run behind the scenes? I can't
>imagine any organization being OK with that unless there's some kind of service
>fee and I don't believe that would be a reasonable request of Mozilla.org.
metamirrors ( http://www.metamirrors.com/ ) links checksums and URLs, similar to filemirrors (which does not use checksums - a big security risk). Results are returned as .metalink files (the XML format used by download managers). Partial-file checksums can be included in .metalinks to protect against malicious/defective mirrors, so you don't have to wait for the whole download to complete to find out that the checksum doesn't match and you have to start over.
Unrelated to metamirrors: DownThemAll (a Firefox download-manager extension which is also adding Metalink support) could be a great gateway for people wanting to try out accelerated downloads without installing a separate app. It doesn't support multiple mirrors yet, but that's being added.
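The partial-file-checksum idea above can be sketched as follows. The piece size and function names are illustrative assumptions; the MD5 choice mirrors the comment:

```python
import hashlib

def piece_hashes(data, piece_size):
    """MD5 of each fixed-size piece, so a chunk served by a malicious or
    defective mirror can be detected and re-fetched on its own, without
    restarting the whole download."""
    return [hashlib.md5(data[i:i + piece_size]).hexdigest()
            for i in range(0, len(data), piece_size)]

def first_bad_piece(data, piece_size, expected_hashes):
    """Return the index of the first piece whose hash mismatches, or None."""
    for i, h in enumerate(piece_hashes(data, piece_size)):
        if h != expected_hashes[i]:
            return i
    return None
```

With whole-file checksums only, a single corrupt byte forces a full re-download; with per-piece hashes, only the offending piece (here identified by index) needs re-fetching from another mirror.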
Comment 43•16 years ago
There is a beta Internet browser based on Firefox called Wyzo that has this implemented.
http://www.wyzo.com/
Updated•15 years ago
Assignee: file-handling → nobody
QA Contact: ian → file-handling
Updated•8 years ago
Product: Core → Firefox
Version: Trunk → unspecified
Updated•2 years ago
Severity: normal → S3